Nov 29 05:35:36 localhost kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 29 05:35:36 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 29 05:35:36 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 05:35:36 localhost kernel: BIOS-provided physical RAM map:
Nov 29 05:35:36 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 29 05:35:36 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 29 05:35:36 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 29 05:35:36 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 29 05:35:36 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 29 05:35:36 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 29 05:35:36 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 29 05:35:36 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 29 05:35:36 localhost kernel: NX (Execute Disable) protection: active
Nov 29 05:35:36 localhost kernel: APIC: Static calls initialized
Nov 29 05:35:36 localhost kernel: SMBIOS 2.8 present.
Nov 29 05:35:36 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 29 05:35:36 localhost kernel: Hypervisor detected: KVM
Nov 29 05:35:36 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 29 05:35:36 localhost kernel: kvm-clock: using sched offset of 3318926677 cycles
Nov 29 05:35:36 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 29 05:35:36 localhost kernel: tsc: Detected 2799.998 MHz processor
Nov 29 05:35:36 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 29 05:35:36 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 29 05:35:36 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 29 05:35:36 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 29 05:35:36 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 29 05:35:36 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 29 05:35:36 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 29 05:35:36 localhost kernel: Using GB pages for direct mapping
Nov 29 05:35:36 localhost kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 29 05:35:36 localhost kernel: ACPI: Early table checksum verification disabled
Nov 29 05:35:36 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 29 05:35:36 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:36 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:36 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:36 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 29 05:35:36 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:36 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:36 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 29 05:35:36 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 29 05:35:36 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 29 05:35:36 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 29 05:35:36 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 29 05:35:36 localhost kernel: No NUMA configuration found
Nov 29 05:35:36 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 29 05:35:36 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Nov 29 05:35:36 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 29 05:35:36 localhost kernel: Zone ranges:
Nov 29 05:35:36 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 29 05:35:36 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 29 05:35:36 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 05:35:36 localhost kernel:   Device   empty
Nov 29 05:35:36 localhost kernel: Movable zone start for each node
Nov 29 05:35:36 localhost kernel: Early memory node ranges
Nov 29 05:35:36 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 29 05:35:36 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 29 05:35:36 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 05:35:36 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 29 05:35:36 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 29 05:35:36 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 29 05:35:36 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 29 05:35:36 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 29 05:35:36 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 29 05:35:36 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 29 05:35:36 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 29 05:35:36 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 29 05:35:36 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 29 05:35:36 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 29 05:35:36 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 29 05:35:36 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 29 05:35:36 localhost kernel: TSC deadline timer available
Nov 29 05:35:36 localhost kernel: CPU topo: Max. logical packages:   8
Nov 29 05:35:36 localhost kernel: CPU topo: Max. logical dies:       8
Nov 29 05:35:36 localhost kernel: CPU topo: Max. dies per package:   1
Nov 29 05:35:36 localhost kernel: CPU topo: Max. threads per core:   1
Nov 29 05:35:36 localhost kernel: CPU topo: Num. cores per package:     1
Nov 29 05:35:36 localhost kernel: CPU topo: Num. threads per package:   1
Nov 29 05:35:36 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 29 05:35:36 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 29 05:35:36 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 29 05:35:36 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 29 05:35:36 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 29 05:35:36 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 29 05:35:36 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 29 05:35:36 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 29 05:35:36 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 29 05:35:36 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 29 05:35:36 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 29 05:35:36 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 29 05:35:36 localhost kernel: Booting paravirtualized kernel on KVM
Nov 29 05:35:36 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 29 05:35:36 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 29 05:35:36 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 29 05:35:36 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Nov 29 05:35:36 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 29 05:35:36 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 29 05:35:36 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 05:35:36 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 29 05:35:36 localhost kernel: random: crng init done
Nov 29 05:35:36 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 29 05:35:36 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 29 05:35:36 localhost kernel: Fallback order for Node 0: 0 
Nov 29 05:35:36 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 29 05:35:36 localhost kernel: Policy zone: Normal
Nov 29 05:35:36 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 29 05:35:36 localhost kernel: software IO TLB: area num 8.
Nov 29 05:35:36 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 29 05:35:36 localhost kernel: ftrace: allocating 49313 entries in 193 pages
Nov 29 05:35:36 localhost kernel: ftrace: allocated 193 pages with 3 groups
Nov 29 05:35:36 localhost kernel: Dynamic Preempt: voluntary
Nov 29 05:35:36 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 29 05:35:36 localhost kernel: rcu:         RCU event tracing is enabled.
Nov 29 05:35:36 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 29 05:35:36 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 29 05:35:36 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 29 05:35:36 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 29 05:35:36 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 29 05:35:36 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 29 05:35:36 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 05:35:36 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 05:35:36 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 05:35:36 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 29 05:35:36 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 29 05:35:36 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 29 05:35:36 localhost kernel: Console: colour VGA+ 80x25
Nov 29 05:35:36 localhost kernel: printk: console [ttyS0] enabled
Nov 29 05:35:36 localhost kernel: ACPI: Core revision 20230331
Nov 29 05:35:36 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 29 05:35:36 localhost kernel: x2apic enabled
Nov 29 05:35:36 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Nov 29 05:35:36 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 29 05:35:36 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 29 05:35:36 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 29 05:35:36 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 29 05:35:36 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 29 05:35:36 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 29 05:35:36 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 29 05:35:36 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 29 05:35:36 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 29 05:35:36 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 29 05:35:36 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 29 05:35:36 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 29 05:35:36 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 29 05:35:36 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 29 05:35:36 localhost kernel: x86/bugs: return thunk changed
Nov 29 05:35:36 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 29 05:35:36 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 29 05:35:36 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 29 05:35:36 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 29 05:35:36 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 29 05:35:36 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 29 05:35:36 localhost kernel: Freeing SMP alternatives memory: 40K
Nov 29 05:35:36 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 29 05:35:36 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 29 05:35:36 localhost kernel: landlock: Up and running.
Nov 29 05:35:36 localhost kernel: Yama: becoming mindful.
Nov 29 05:35:36 localhost kernel: SELinux:  Initializing.
Nov 29 05:35:36 localhost kernel: LSM support for eBPF active
Nov 29 05:35:36 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 05:35:36 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 05:35:36 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 29 05:35:36 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 29 05:35:36 localhost kernel: ... version:                0
Nov 29 05:35:36 localhost kernel: ... bit width:              48
Nov 29 05:35:36 localhost kernel: ... generic registers:      6
Nov 29 05:35:36 localhost kernel: ... value mask:             0000ffffffffffff
Nov 29 05:35:36 localhost kernel: ... max period:             00007fffffffffff
Nov 29 05:35:36 localhost kernel: ... fixed-purpose events:   0
Nov 29 05:35:36 localhost kernel: ... event mask:             000000000000003f
Nov 29 05:35:36 localhost kernel: signal: max sigframe size: 1776
Nov 29 05:35:36 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 29 05:35:36 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 29 05:35:36 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 29 05:35:36 localhost kernel: smpboot: x86: Booting SMP configuration:
Nov 29 05:35:36 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 29 05:35:36 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 29 05:35:36 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 29 05:35:36 localhost kernel: node 0 deferred pages initialised in 6ms
Nov 29 05:35:36 localhost kernel: Memory: 7766056K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616276K reserved, 0K cma-reserved)
Nov 29 05:35:36 localhost kernel: devtmpfs: initialized
Nov 29 05:35:36 localhost kernel: x86/mm: Memory block size: 128MB
Nov 29 05:35:36 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 29 05:35:36 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 29 05:35:36 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 29 05:35:36 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 29 05:35:36 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 29 05:35:36 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 29 05:35:36 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 29 05:35:36 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 29 05:35:36 localhost kernel: audit: type=2000 audit(1764394534.880:1): state=initialized audit_enabled=0 res=1
Nov 29 05:35:36 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 29 05:35:36 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 29 05:35:36 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 29 05:35:36 localhost kernel: cpuidle: using governor menu
Nov 29 05:35:36 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 29 05:35:36 localhost kernel: PCI: Using configuration type 1 for base access
Nov 29 05:35:36 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 29 05:35:36 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 29 05:35:36 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 29 05:35:36 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 29 05:35:36 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 29 05:35:36 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 29 05:35:36 localhost kernel: Demotion targets for Node 0: null
Nov 29 05:35:36 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 29 05:35:36 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 29 05:35:36 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 29 05:35:36 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 29 05:35:36 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 29 05:35:36 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 29 05:35:36 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 29 05:35:36 localhost kernel: ACPI: Interpreter enabled
Nov 29 05:35:36 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 29 05:35:36 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 29 05:35:36 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 29 05:35:36 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 29 05:35:36 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 29 05:35:36 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 29 05:35:36 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [3] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [4] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [5] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [6] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [7] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [8] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [9] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [10] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [11] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [12] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [13] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [14] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [15] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [16] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [17] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [18] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [19] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [20] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [21] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [22] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [23] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [24] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [25] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [26] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [27] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [28] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [29] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [30] registered
Nov 29 05:35:36 localhost kernel: acpiphp: Slot [31] registered
Nov 29 05:35:36 localhost kernel: PCI host bridge to bus 0000:00
Nov 29 05:35:36 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 29 05:35:36 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 29 05:35:36 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 29 05:35:36 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 29 05:35:36 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 29 05:35:36 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 29 05:35:36 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 29 05:35:36 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 29 05:35:36 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 29 05:35:36 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 29 05:35:36 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 29 05:35:36 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 29 05:35:36 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 29 05:35:36 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 05:35:36 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 29 05:35:36 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 29 05:35:36 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 29 05:35:36 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 29 05:35:36 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 29 05:35:36 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 29 05:35:36 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 29 05:35:36 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 29 05:35:36 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 05:35:36 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 29 05:35:36 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 29 05:35:36 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 05:35:36 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 29 05:35:36 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 29 05:35:36 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 29 05:35:36 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 29 05:35:36 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 29 05:35:36 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 29 05:35:36 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 29 05:35:36 localhost kernel: iommu: Default domain type: Translated
Nov 29 05:35:36 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 29 05:35:36 localhost kernel: SCSI subsystem initialized
Nov 29 05:35:36 localhost kernel: ACPI: bus type USB registered
Nov 29 05:35:36 localhost kernel: usbcore: registered new interface driver usbfs
Nov 29 05:35:36 localhost kernel: usbcore: registered new interface driver hub
Nov 29 05:35:36 localhost kernel: usbcore: registered new device driver usb
Nov 29 05:35:36 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 29 05:35:36 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 29 05:35:36 localhost kernel: PTP clock support registered
Nov 29 05:35:36 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 29 05:35:36 localhost kernel: NetLabel: Initializing
Nov 29 05:35:36 localhost kernel: NetLabel:  domain hash size = 128
Nov 29 05:35:36 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 29 05:35:36 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 29 05:35:36 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 29 05:35:36 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 29 05:35:36 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 29 05:35:36 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 29 05:35:36 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 29 05:35:36 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 29 05:35:36 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 29 05:35:36 localhost kernel: vgaarb: loaded
Nov 29 05:35:36 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 29 05:35:36 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 29 05:35:36 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 29 05:35:36 localhost kernel: pnp: PnP ACPI init
Nov 29 05:35:36 localhost kernel: pnp 00:03: [dma 2]
Nov 29 05:35:36 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 29 05:35:36 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 29 05:35:36 localhost kernel: NET: Registered PF_INET protocol family
Nov 29 05:35:36 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 29 05:35:36 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 29 05:35:36 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 29 05:35:36 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 29 05:35:36 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 29 05:35:36 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 29 05:35:36 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 29 05:35:36 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 05:35:36 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 05:35:36 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 29 05:35:36 localhost kernel: NET: Registered PF_XDP protocol family
Nov 29 05:35:36 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 29 05:35:36 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 29 05:35:36 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 29 05:35:36 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 29 05:35:36 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 29 05:35:36 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 29 05:35:36 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 29 05:35:36 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 72424 usecs
Nov 29 05:35:36 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 29 05:35:36 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 29 05:35:36 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 29 05:35:36 localhost kernel: ACPI: bus type thunderbolt registered
Nov 29 05:35:36 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 29 05:35:36 localhost kernel: Initialise system trusted keyrings
Nov 29 05:35:36 localhost kernel: Key type blacklist registered
Nov 29 05:35:36 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 29 05:35:36 localhost kernel: zbud: loaded
Nov 29 05:35:36 localhost kernel: integrity: Platform Keyring initialized
Nov 29 05:35:36 localhost kernel: integrity: Machine keyring initialized
Nov 29 05:35:36 localhost kernel: Freeing initrd memory: 85868K
Nov 29 05:35:36 localhost kernel: NET: Registered PF_ALG protocol family
Nov 29 05:35:36 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 29 05:35:36 localhost kernel: Key type asymmetric registered
Nov 29 05:35:36 localhost kernel: Asymmetric key parser 'x509' registered
Nov 29 05:35:36 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 29 05:35:36 localhost kernel: io scheduler mq-deadline registered
Nov 29 05:35:36 localhost kernel: io scheduler kyber registered
Nov 29 05:35:36 localhost kernel: io scheduler bfq registered
Nov 29 05:35:36 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 29 05:35:36 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 29 05:35:36 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 29 05:35:36 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 29 05:35:36 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 29 05:35:36 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 29 05:35:36 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 29 05:35:36 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 29 05:35:36 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 29 05:35:36 localhost kernel: Non-volatile memory driver v1.3
Nov 29 05:35:36 localhost kernel: rdac: device handler registered
Nov 29 05:35:36 localhost kernel: hp_sw: device handler registered
Nov 29 05:35:36 localhost kernel: emc: device handler registered
Nov 29 05:35:36 localhost kernel: alua: device handler registered
Nov 29 05:35:36 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 29 05:35:36 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 29 05:35:36 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 29 05:35:36 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 29 05:35:36 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 29 05:35:36 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 29 05:35:36 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 29 05:35:36 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 29 05:35:36 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 29 05:35:36 localhost kernel: hub 1-0:1.0: USB hub found
Nov 29 05:35:36 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 29 05:35:36 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 29 05:35:36 localhost kernel: usbserial: USB Serial support registered for generic
Nov 29 05:35:36 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 29 05:35:36 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 29 05:35:36 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 29 05:35:36 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 29 05:35:36 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 29 05:35:36 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 29 05:35:36 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-29T05:35:35 UTC (1764394535)
Nov 29 05:35:36 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 29 05:35:36 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 29 05:35:36 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 29 05:35:36 localhost kernel: usbcore: registered new interface driver usbhid
Nov 29 05:35:36 localhost kernel: usbhid: USB HID core driver
Nov 29 05:35:36 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 29 05:35:36 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 29 05:35:36 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 29 05:35:36 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 29 05:35:36 localhost kernel: Initializing XFRM netlink socket
Nov 29 05:35:36 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 29 05:35:36 localhost kernel: Segment Routing with IPv6
Nov 29 05:35:36 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 29 05:35:36 localhost kernel: mpls_gso: MPLS GSO support
Nov 29 05:35:36 localhost kernel: IPI shorthand broadcast: enabled
Nov 29 05:35:36 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 29 05:35:36 localhost kernel: AES CTR mode by8 optimization enabled
Nov 29 05:35:36 localhost kernel: sched_clock: Marking stable (1193005233, 147104155)->(1454713589, -114604201)
Nov 29 05:35:36 localhost kernel: registered taskstats version 1
Nov 29 05:35:36 localhost kernel: Loading compiled-in X.509 certificates
Nov 29 05:35:36 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 05:35:36 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 29 05:35:36 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 29 05:35:36 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 29 05:35:36 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 29 05:35:36 localhost kernel: Demotion targets for Node 0: null
Nov 29 05:35:36 localhost kernel: page_owner is disabled
Nov 29 05:35:36 localhost kernel: Key type .fscrypt registered
Nov 29 05:35:36 localhost kernel: Key type fscrypt-provisioning registered
Nov 29 05:35:36 localhost kernel: Key type big_key registered
Nov 29 05:35:36 localhost kernel: Key type encrypted registered
Nov 29 05:35:36 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 29 05:35:36 localhost kernel: Loading compiled-in module X.509 certificates
Nov 29 05:35:36 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 05:35:36 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 29 05:35:36 localhost kernel: ima: No architecture policies found
Nov 29 05:35:36 localhost kernel: evm: Initialising EVM extended attributes:
Nov 29 05:35:36 localhost kernel: evm: security.selinux
Nov 29 05:35:36 localhost kernel: evm: security.SMACK64 (disabled)
Nov 29 05:35:36 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 29 05:35:36 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 29 05:35:36 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 29 05:35:36 localhost kernel: evm: security.apparmor (disabled)
Nov 29 05:35:36 localhost kernel: evm: security.ima
Nov 29 05:35:36 localhost kernel: evm: security.capability
Nov 29 05:35:36 localhost kernel: evm: HMAC attrs: 0x1
Nov 29 05:35:36 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 29 05:35:36 localhost kernel: Running certificate verification RSA selftest
Nov 29 05:35:36 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 29 05:35:36 localhost kernel: Running certificate verification ECDSA selftest
Nov 29 05:35:36 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 29 05:35:36 localhost kernel: clk: Disabling unused clocks
Nov 29 05:35:36 localhost kernel: Freeing unused decrypted memory: 2028K
Nov 29 05:35:36 localhost kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 29 05:35:36 localhost kernel: Write protecting the kernel read-only data: 30720k
Nov 29 05:35:36 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 29 05:35:36 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 29 05:35:36 localhost kernel: Run /init as init process
Nov 29 05:35:36 localhost kernel:   with arguments:
Nov 29 05:35:36 localhost kernel:     /init
Nov 29 05:35:36 localhost kernel:   with environment:
Nov 29 05:35:36 localhost kernel:     HOME=/
Nov 29 05:35:36 localhost kernel:     TERM=linux
Nov 29 05:35:36 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64
Nov 29 05:35:36 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 05:35:36 localhost systemd[1]: Detected virtualization kvm.
Nov 29 05:35:36 localhost systemd[1]: Detected architecture x86-64.
Nov 29 05:35:36 localhost systemd[1]: Running in initrd.
Nov 29 05:35:36 localhost systemd[1]: No hostname configured, using default hostname.
Nov 29 05:35:36 localhost systemd[1]: Hostname set to <localhost>.
Nov 29 05:35:36 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 29 05:35:36 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 29 05:35:36 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 29 05:35:36 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 29 05:35:36 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 29 05:35:36 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 29 05:35:36 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 29 05:35:36 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 29 05:35:36 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 29 05:35:36 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 05:35:36 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 29 05:35:36 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 29 05:35:36 localhost systemd[1]: Reached target Local File Systems.
Nov 29 05:35:36 localhost systemd[1]: Reached target Path Units.
Nov 29 05:35:36 localhost systemd[1]: Reached target Slice Units.
Nov 29 05:35:36 localhost systemd[1]: Reached target Swaps.
Nov 29 05:35:36 localhost systemd[1]: Reached target Timer Units.
Nov 29 05:35:36 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 29 05:35:36 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 29 05:35:36 localhost systemd[1]: Listening on Journal Socket.
Nov 29 05:35:36 localhost systemd[1]: Listening on udev Control Socket.
Nov 29 05:35:36 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 29 05:35:36 localhost systemd[1]: Reached target Socket Units.
Nov 29 05:35:36 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 29 05:35:36 localhost systemd[1]: Starting Journal Service...
Nov 29 05:35:36 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 05:35:36 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 29 05:35:36 localhost systemd[1]: Starting Create System Users...
Nov 29 05:35:36 localhost systemd[1]: Starting Setup Virtual Console...
Nov 29 05:35:36 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 29 05:35:36 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 29 05:35:36 localhost systemd[1]: Finished Create System Users.
Nov 29 05:35:36 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 05:35:36 localhost systemd-journald[303]: Journal started
Nov 29 05:35:36 localhost systemd-journald[303]: Runtime Journal (/run/log/journal/6289b14c9d0e4084a8992566f6eb59ac) is 8.0M, max 153.6M, 145.6M free.
Nov 29 05:35:36 localhost systemd-sysusers[307]: Creating group 'users' with GID 100.
Nov 29 05:35:36 localhost systemd-sysusers[307]: Creating group 'dbus' with GID 81.
Nov 29 05:35:36 localhost systemd-sysusers[307]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 29 05:35:36 localhost systemd[1]: Started Journal Service.
Nov 29 05:35:36 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 05:35:36 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 05:35:36 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 05:35:36 localhost systemd[1]: Finished Setup Virtual Console.
Nov 29 05:35:36 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 29 05:35:36 localhost systemd[1]: Starting dracut cmdline hook...
Nov 29 05:35:36 localhost dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Nov 29 05:35:36 localhost dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 05:35:36 localhost systemd[1]: Finished dracut cmdline hook.
Nov 29 05:35:36 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 29 05:35:36 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 29 05:35:36 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 29 05:35:36 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 29 05:35:36 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 29 05:35:36 localhost kernel: RPC: Registered udp transport module.
Nov 29 05:35:36 localhost kernel: RPC: Registered tcp transport module.
Nov 29 05:35:36 localhost kernel: RPC: Registered tcp-with-tls transport module.
Nov 29 05:35:36 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 29 05:35:36 localhost rpc.statd[442]: Version 2.5.4 starting
Nov 29 05:35:36 localhost rpc.statd[442]: Initializing NSM state
Nov 29 05:35:36 localhost rpc.idmapd[447]: Setting log level to 0
Nov 29 05:35:36 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 29 05:35:36 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 05:35:36 localhost systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 05:35:37 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 05:35:37 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 29 05:35:37 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 29 05:35:37 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 29 05:35:37 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 29 05:35:37 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 29 05:35:37 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 29 05:35:37 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 05:35:37 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 29 05:35:37 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 29 05:35:37 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 05:35:37 localhost systemd[1]: Reached target Network.
Nov 29 05:35:37 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 05:35:37 localhost systemd[1]: Starting dracut initqueue hook...
Nov 29 05:35:37 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 29 05:35:37 localhost systemd[1]: Reached target System Initialization.
Nov 29 05:35:37 localhost systemd[1]: Reached target Basic System.
Nov 29 05:35:37 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 29 05:35:37 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 29 05:35:37 localhost kernel: libata version 3.00 loaded.
Nov 29 05:35:37 localhost kernel:  vda: vda1
Nov 29 05:35:37 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 29 05:35:37 localhost systemd-udevd[480]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 05:35:37 localhost kernel: scsi host0: ata_piix
Nov 29 05:35:37 localhost kernel: scsi host1: ata_piix
Nov 29 05:35:37 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 29 05:35:37 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 29 05:35:37 localhost systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 05:35:37 localhost systemd[1]: Reached target Initrd Root Device.
Nov 29 05:35:37 localhost kernel: ata1: found unknown device (class 0)
Nov 29 05:35:37 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 29 05:35:37 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 29 05:35:37 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 29 05:35:37 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 29 05:35:37 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 29 05:35:37 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 29 05:35:37 localhost systemd[1]: Finished dracut initqueue hook.
Nov 29 05:35:37 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 05:35:37 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 29 05:35:37 localhost systemd[1]: Reached target Remote File Systems.
Nov 29 05:35:37 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 29 05:35:37 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 29 05:35:37 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Nov 29 05:35:37 localhost systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Nov 29 05:35:37 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 05:35:37 localhost systemd[1]: Mounting /sysroot...
Nov 29 05:35:38 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 29 05:35:38 localhost kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Nov 29 05:35:38 localhost kernel: XFS (vda1): Ending clean mount
Nov 29 05:35:38 localhost systemd[1]: Mounted /sysroot.
Nov 29 05:35:38 localhost systemd[1]: Reached target Initrd Root File System.
Nov 29 05:35:38 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 29 05:35:38 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 29 05:35:38 localhost systemd[1]: Reached target Initrd File Systems.
Nov 29 05:35:38 localhost systemd[1]: Reached target Initrd Default Target.
Nov 29 05:35:38 localhost systemd[1]: Starting dracut mount hook...
Nov 29 05:35:38 localhost systemd[1]: Finished dracut mount hook.
Nov 29 05:35:38 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 29 05:35:38 localhost rpc.idmapd[447]: exiting on signal 15
Nov 29 05:35:38 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 29 05:35:38 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 29 05:35:38 localhost systemd[1]: Stopped target Network.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Timer Units.
Nov 29 05:35:38 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 29 05:35:38 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Basic System.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Path Units.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Remote File Systems.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Slice Units.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Socket Units.
Nov 29 05:35:38 localhost systemd[1]: Stopped target System Initialization.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Local File Systems.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Swaps.
Nov 29 05:35:38 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped dracut mount hook.
Nov 29 05:35:38 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 29 05:35:38 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 29 05:35:38 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 29 05:35:38 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 29 05:35:38 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 29 05:35:38 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 29 05:35:38 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 29 05:35:38 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 29 05:35:38 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 29 05:35:38 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 29 05:35:38 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 29 05:35:38 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 29 05:35:38 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Closed udev Control Socket.
Nov 29 05:35:38 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Closed udev Kernel Socket.
Nov 29 05:35:38 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 29 05:35:38 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 29 05:35:38 localhost systemd[1]: Starting Cleanup udev Database...
Nov 29 05:35:38 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 29 05:35:38 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 29 05:35:38 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Stopped Create System Users.
Nov 29 05:35:38 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 29 05:35:38 localhost systemd[1]: Finished Cleanup udev Database.
Nov 29 05:35:38 localhost systemd[1]: Reached target Switch Root.
Nov 29 05:35:38 localhost systemd[1]: Starting Switch Root...
Nov 29 05:35:38 localhost systemd[1]: Switching root.
Nov 29 05:35:38 localhost systemd-journald[303]: Journal stopped
Nov 29 05:35:39 localhost systemd-journald[303]: Received SIGTERM from PID 1 (systemd).
Nov 29 05:35:39 localhost kernel: audit: type=1404 audit(1764394539.051:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 29 05:35:39 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:35:39 localhost kernel: SELinux:  policy capability open_perms=1
Nov 29 05:35:39 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:35:39 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:35:39 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:35:39 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:35:39 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:35:39 localhost kernel: audit: type=1403 audit(1764394539.178:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 29 05:35:39 localhost systemd[1]: Successfully loaded SELinux policy in 129.644ms.
Nov 29 05:35:39 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.293ms.
Nov 29 05:35:39 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 05:35:39 localhost systemd[1]: Detected virtualization kvm.
Nov 29 05:35:39 localhost systemd[1]: Detected architecture x86-64.
Nov 29 05:35:39 localhost systemd-rc-local-generator[636]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 05:35:39 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 29 05:35:39 localhost systemd[1]: Stopped Switch Root.
Nov 29 05:35:39 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 29 05:35:39 localhost systemd[1]: Created slice Slice /system/getty.
Nov 29 05:35:39 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 29 05:35:39 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 29 05:35:39 localhost systemd[1]: Created slice User and Session Slice.
Nov 29 05:35:39 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 05:35:39 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 29 05:35:39 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 29 05:35:39 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 29 05:35:39 localhost systemd[1]: Stopped target Switch Root.
Nov 29 05:35:39 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 29 05:35:39 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 29 05:35:39 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 29 05:35:39 localhost systemd[1]: Reached target Path Units.
Nov 29 05:35:39 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 29 05:35:39 localhost systemd[1]: Reached target Slice Units.
Nov 29 05:35:39 localhost systemd[1]: Reached target Swaps.
Nov 29 05:35:39 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 29 05:35:39 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 29 05:35:39 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 29 05:35:39 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 29 05:35:39 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 29 05:35:39 localhost systemd[1]: Listening on udev Control Socket.
Nov 29 05:35:39 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 29 05:35:39 localhost systemd[1]: Mounting Huge Pages File System...
Nov 29 05:35:39 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 29 05:35:39 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 29 05:35:39 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 29 05:35:39 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 05:35:39 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 29 05:35:39 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 29 05:35:39 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 29 05:35:39 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Nov 29 05:35:39 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 29 05:35:39 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 29 05:35:39 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 29 05:35:39 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 29 05:35:39 localhost systemd[1]: Stopped Journal Service.
Nov 29 05:35:39 localhost systemd[1]: Starting Journal Service...
Nov 29 05:35:39 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 05:35:39 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 29 05:35:39 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 05:35:39 localhost systemd-journald[677]: Journal started
Nov 29 05:35:39 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 05:35:39 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 29 05:35:39 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 29 05:35:39 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 29 05:35:39 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 29 05:35:39 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 29 05:35:39 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 29 05:35:39 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 29 05:35:39 localhost systemd[1]: Started Journal Service.
Nov 29 05:35:39 localhost kernel: ACPI: bus type drm_connector registered
Nov 29 05:35:39 localhost kernel: fuse: init (API version 7.37)
Nov 29 05:35:39 localhost systemd[1]: Mounted Huge Pages File System.
Nov 29 05:35:39 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 29 05:35:39 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 29 05:35:39 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 29 05:35:39 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 29 05:35:39 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 05:35:39 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 29 05:35:39 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 29 05:35:39 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 29 05:35:39 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 29 05:35:39 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 29 05:35:39 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 29 05:35:39 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 29 05:35:39 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 29 05:35:39 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 29 05:35:39 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 29 05:35:39 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 29 05:35:39 localhost systemd[1]: Mounting FUSE Control File System...
Nov 29 05:35:39 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 05:35:39 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 29 05:35:39 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 29 05:35:39 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 29 05:35:39 localhost systemd[1]: Starting Load/Save OS Random Seed...
Nov 29 05:35:39 localhost systemd[1]: Starting Create System Users...
Nov 29 05:35:39 localhost systemd[1]: Mounted FUSE Control File System.
Nov 29 05:35:39 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 05:35:39 localhost systemd-journald[677]: Received client request to flush runtime journal.
Nov 29 05:35:39 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 29 05:35:39 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 29 05:35:39 localhost systemd[1]: Finished Create System Users.
Nov 29 05:35:39 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 05:35:39 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 05:35:39 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 29 05:35:39 localhost systemd[1]: Reached target Local File Systems.
Nov 29 05:35:39 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 29 05:35:39 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 29 05:35:39 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 29 05:35:39 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 29 05:35:39 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 05:35:40 localhost bootctl[694]: Couldn't find EFI system partition, skipping.
Nov 29 05:35:40 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 29 05:35:40 localhost systemd[1]: Finished Load/Save OS Random Seed.
Nov 29 05:35:40 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 05:35:40 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 29 05:35:40 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 29 05:35:40 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 05:35:40 localhost systemd[1]: Starting Security Auditing Service...
Nov 29 05:35:40 localhost systemd[1]: Starting RPC Bind...
Nov 29 05:35:40 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 29 05:35:40 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 29 05:35:40 localhost auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 29 05:35:40 localhost auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 29 05:35:40 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 29 05:35:40 localhost systemd[1]: Started RPC Bind.
Nov 29 05:35:40 localhost augenrules[706]: /sbin/augenrules: No change
Nov 29 05:35:40 localhost augenrules[721]: No rules
Nov 29 05:35:40 localhost augenrules[721]: enabled 1
Nov 29 05:35:40 localhost augenrules[721]: failure 1
Nov 29 05:35:40 localhost augenrules[721]: pid 701
Nov 29 05:35:40 localhost augenrules[721]: rate_limit 0
Nov 29 05:35:40 localhost augenrules[721]: backlog_limit 8192
Nov 29 05:35:40 localhost augenrules[721]: lost 0
Nov 29 05:35:40 localhost augenrules[721]: backlog 4
Nov 29 05:35:40 localhost augenrules[721]: backlog_wait_time 60000
Nov 29 05:35:40 localhost augenrules[721]: backlog_wait_time_actual 0
Nov 29 05:35:40 localhost systemd[1]: Started Security Auditing Service.
Nov 29 05:35:40 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 29 05:35:40 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 29 05:35:40 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 29 05:35:40 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 05:35:40 localhost systemd[1]: Starting Update is Completed...
Nov 29 05:35:40 localhost systemd[1]: Finished Update is Completed.
Nov 29 05:35:40 localhost systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 05:35:40 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 05:35:40 localhost systemd[1]: Reached target System Initialization.
Nov 29 05:35:40 localhost systemd[1]: Started dnf makecache --timer.
Nov 29 05:35:40 localhost systemd[1]: Started Daily rotation of log files.
Nov 29 05:35:40 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 29 05:35:40 localhost systemd[1]: Reached target Timer Units.
Nov 29 05:35:40 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 29 05:35:40 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 29 05:35:40 localhost systemd[1]: Reached target Socket Units.
Nov 29 05:35:40 localhost systemd-udevd[737]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 05:35:40 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 29 05:35:40 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 05:35:40 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 29 05:35:40 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 29 05:35:40 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 05:35:40 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 29 05:35:40 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 29 05:35:40 localhost systemd[1]: Reached target Basic System.
Nov 29 05:35:40 localhost dbus-broker-lau[768]: Ready
Nov 29 05:35:40 localhost systemd[1]: Starting NTP client/server...
Nov 29 05:35:40 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 29 05:35:40 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 29 05:35:40 localhost systemd[1]: Starting IPv4 firewall with iptables...
Nov 29 05:35:40 localhost systemd[1]: Started irqbalance daemon.
Nov 29 05:35:40 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 29 05:35:40 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 29 05:35:40 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 29 05:35:40 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 29 05:35:40 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 29 05:35:40 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 05:35:40 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 05:35:40 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 05:35:40 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 29 05:35:40 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 29 05:35:40 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 29 05:35:40 localhost systemd[1]: Starting User Login Management...
Nov 29 05:35:40 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 29 05:35:40 localhost chronyd[792]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 05:35:40 localhost chronyd[792]: Loaded 0 symmetric keys
Nov 29 05:35:40 localhost chronyd[792]: Using right/UTC timezone to obtain leap second data
Nov 29 05:35:40 localhost chronyd[792]: Loaded seccomp filter (level 2)
Nov 29 05:35:40 localhost systemd[1]: Started NTP client/server.
Nov 29 05:35:40 localhost systemd-logind[785]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 05:35:40 localhost systemd-logind[785]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 05:35:40 localhost systemd-logind[785]: New seat seat0.
Nov 29 05:35:40 localhost systemd[1]: Started User Login Management.
Nov 29 05:35:40 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 29 05:35:41 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 29 05:35:41 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 29 05:35:41 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 29 05:35:41 localhost kernel: Console: switching to colour dummy device 80x25
Nov 29 05:35:41 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 29 05:35:41 localhost kernel: [drm] features: -context_init
Nov 29 05:35:41 localhost kernel: [drm] number of scanouts: 1
Nov 29 05:35:41 localhost kernel: [drm] number of cap sets: 0
Nov 29 05:35:41 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 29 05:35:41 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 29 05:35:41 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 29 05:35:41 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 29 05:35:41 localhost kernel: kvm_amd: TSC scaling supported
Nov 29 05:35:41 localhost kernel: kvm_amd: Nested Virtualization enabled
Nov 29 05:35:41 localhost kernel: kvm_amd: Nested Paging enabled
Nov 29 05:35:41 localhost kernel: kvm_amd: LBR virtualization supported
Nov 29 05:35:41 localhost iptables.init[777]: iptables: Applying firewall rules: [  OK  ]
Nov 29 05:35:41 localhost systemd[1]: Finished IPv4 firewall with iptables.
Nov 29 05:35:41 localhost cloud-init[838]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 29 Nov 2025 05:35:41 +0000. Up 6.91 seconds.
Nov 29 05:35:41 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 29 05:35:41 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 29 05:35:41 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpfly6n2hr.mount: Deactivated successfully.
Nov 29 05:35:41 localhost systemd[1]: Starting Hostname Service...
Nov 29 05:35:41 localhost systemd[1]: Started Hostname Service.
Nov 29 05:35:41 np0005539509.novalocal systemd-hostnamed[852]: Hostname set to <np0005539509.novalocal> (static)
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Reached target Preparation for Network.
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Starting Network Manager...
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.7796] NetworkManager (version 1.54.1-1.el9) is starting... (boot:ba0dbca2-f496-4536-953e-379bfe5fc9e9)
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.7801] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.7873] manager[0x55d11b224080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.7903] hostname: hostname: using hostnamed
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.7904] hostname: static hostname changed from (none) to "np0005539509.novalocal"
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.7907] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.7990] manager[0x55d11b224080]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.7991] manager[0x55d11b224080]: rfkill: WWAN hardware radio set enabled
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8024] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8025] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8025] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8026] manager: Networking is enabled by state file
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8027] settings: Loaded settings plugin: keyfile (internal)
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8034] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8050] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8060] dhcp: init: Using DHCP client 'internal'
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8062] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8072] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8078] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8084] device (lo): Activation: starting connection 'lo' (e88f289b-57af-451c-b662-f7b3e0248e91)
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8090] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8092] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8117] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8122] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8124] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8125] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8128] device (eth0): carrier: link connected
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8130] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8136] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8141] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8145] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8146] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8148] manager: NetworkManager state is now CONNECTING
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8149] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8155] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8159] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Started Network Manager.
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Reached target Network.
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8403] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8406] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 05:35:41 np0005539509.novalocal NetworkManager[856]: <info>  [1764394541.8413] device (lo): Activation: successful, device activated.
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Reached target NFS client services.
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: Reached target Remote File Systems.
Nov 29 05:35:41 np0005539509.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 05:35:42 np0005539509.novalocal NetworkManager[856]: <info>  [1764394542.9564] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 29 05:35:42 np0005539509.novalocal NetworkManager[856]: <info>  [1764394542.9578] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 05:35:42 np0005539509.novalocal NetworkManager[856]: <info>  [1764394542.9604] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 05:35:42 np0005539509.novalocal NetworkManager[856]: <info>  [1764394542.9642] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 05:35:42 np0005539509.novalocal NetworkManager[856]: <info>  [1764394542.9643] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 05:35:42 np0005539509.novalocal NetworkManager[856]: <info>  [1764394542.9646] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 05:35:42 np0005539509.novalocal NetworkManager[856]: <info>  [1764394542.9650] device (eth0): Activation: successful, device activated.
Nov 29 05:35:42 np0005539509.novalocal NetworkManager[856]: <info>  [1764394542.9655] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 05:35:42 np0005539509.novalocal NetworkManager[856]: <info>  [1764394542.9658] manager: startup complete
Nov 29 05:35:42 np0005539509.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 29 05:35:42 np0005539509.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 29 Nov 2025 05:35:43 +0000. Up 8.93 seconds.
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: |  eth0  | True |        38.102.83.204         | 255.255.255.0 | global | fa:16:3e:f4:b0:ce |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: |  eth0  | True | fe80::f816:3eff:fef4:b0ce/64 |       .       |  link  | fa:16:3e:f4:b0:ce |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 29 05:35:43 np0005539509.novalocal cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 05:35:47 np0005539509.novalocal chronyd[792]: Selected source 162.159.200.123 (2.centos.pool.ntp.org)
Nov 29 05:35:47 np0005539509.novalocal chronyd[792]: System clock wrong by 1.179441 seconds
Nov 29 05:35:49 np0005539509.novalocal chronyd[792]: System clock was stepped by 1.179441 seconds
Nov 29 05:35:49 np0005539509.novalocal chronyd[792]: System clock TAI offset set to 37 seconds
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: Cannot change IRQ 35 affinity: Operation not permitted
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: IRQ 35 affinity is now unmanaged
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: IRQ 25 affinity is now unmanaged
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: IRQ 31 affinity is now unmanaged
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: IRQ 26 affinity is now unmanaged
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: IRQ 30 affinity is now unmanaged
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 29 05:35:52 np0005539509.novalocal irqbalance[781]: IRQ 29 affinity is now unmanaged
Nov 29 05:35:54 np0005539509.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:36:13 np0005539509.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 05:37:22 np0005539509.novalocal useradd[988]: new group: name=cloud-user, GID=1001
Nov 29 05:37:22 np0005539509.novalocal useradd[988]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 29 05:37:22 np0005539509.novalocal useradd[988]: add 'cloud-user' to group 'adm'
Nov 29 05:37:22 np0005539509.novalocal useradd[988]: add 'cloud-user' to group 'systemd-journal'
Nov 29 05:37:22 np0005539509.novalocal useradd[988]: add 'cloud-user' to shadow group 'adm'
Nov 29 05:37:22 np0005539509.novalocal useradd[988]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: Generating public/private rsa key pair.
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: The key fingerprint is:
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: SHA256:zHsTTn9kXEJ/sx6L0lH9r9QVsWcWY+VXHpZ+XptPw2E root@np0005539509.novalocal
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: The key's randomart image is:
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: +---[RSA 3072]----+
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |              .B=|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |             .o=O|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |              o=%|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |       o     ..E@|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |        S o  .*+O|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |         + o.o+*B|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |        . +..oo+=|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |         . ..o ..|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |              .  |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: +----[SHA256]-----+
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: Generating public/private ecdsa key pair.
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: The key fingerprint is:
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: SHA256:hrB3mjrdZLTcMKbz+gfWCLe38DVhYocwMJUD7MjsnkY root@np0005539509.novalocal
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: The key's randomart image is:
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: +---[ECDSA 256]---+
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |     .++..       |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |      ..=        |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |   o.o   + .     |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |    +oo.* + +    |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |   .. oBSX + .   |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |    E.o=@ + o    |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |   o ooB = o .   |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |    =.. o +      |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |   ....o..       |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: +----[SHA256]-----+
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: Generating public/private ed25519 key pair.
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: The key fingerprint is:
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: SHA256:Trsar2Pf1VIfL+mFrnewRWxstKDdw89MSvBE4PGb0GE root@np0005539509.novalocal
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: The key's randomart image is:
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: +--[ED25519 256]--+
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |            ooE  |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |           ..*...|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |            =+*+.|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |           . ooBB|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |        S    .+Xo|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |       o .   oo+B|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |      . o   o =++|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |      oo o . +oo.|
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: |     .o=+ . .oo. |
Nov 29 05:37:24 np0005539509.novalocal cloud-init[919]: +----[SHA256]-----+
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Reached target Network is Online.
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Starting System Logging Service...
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 29 05:37:24 np0005539509.novalocal sm-notify[1005]: Version 2.5.4 starting
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Starting Permit User Sessions...
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 29 05:37:24 np0005539509.novalocal sshd[1007]: Server listening on 0.0.0.0 port 22.
Nov 29 05:37:24 np0005539509.novalocal sshd[1007]: Server listening on :: port 22.
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Finished Permit User Sessions.
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Started Command Scheduler.
Nov 29 05:37:24 np0005539509.novalocal rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Nov 29 05:37:24 np0005539509.novalocal rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Started Getty on tty1.
Nov 29 05:37:24 np0005539509.novalocal crond[1011]: (CRON) STARTUP (1.5.7)
Nov 29 05:37:24 np0005539509.novalocal crond[1011]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 29 05:37:24 np0005539509.novalocal crond[1011]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 47% if used.)
Nov 29 05:37:24 np0005539509.novalocal crond[1011]: (CRON) INFO (running with inotify support)
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Reached target Login Prompts.
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Started System Logging Service.
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Reached target Multi-User System.
Nov 29 05:37:24 np0005539509.novalocal sshd-session[1010]: Connection reset by 38.102.83.114 port 50518 [preauth]
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 29 05:37:24 np0005539509.novalocal sshd-session[1021]: Unable to negotiate with 38.102.83.114 port 50532: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 29 05:37:24 np0005539509.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 29 05:37:24 np0005539509.novalocal sshd-session[1035]: Unable to negotiate with 38.102.83.114 port 50548: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 29 05:37:24 np0005539509.novalocal rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 05:37:24 np0005539509.novalocal sshd-session[1042]: Unable to negotiate with 38.102.83.114 port 50552: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 29 05:37:24 np0005539509.novalocal sshd-session[1066]: Connection reset by 38.102.83.114 port 50566 [preauth]
Nov 29 05:37:24 np0005539509.novalocal sshd-session[1029]: Connection closed by 38.102.83.114 port 50534 [preauth]
Nov 29 05:37:24 np0005539509.novalocal sshd-session[1077]: Unable to negotiate with 38.102.83.114 port 50572: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Nov 29 05:37:24 np0005539509.novalocal kdumpctl[1022]: kdump: No kdump initial ramdisk found.
Nov 29 05:37:24 np0005539509.novalocal kdumpctl[1022]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 29 05:37:24 np0005539509.novalocal sshd-session[1080]: Unable to negotiate with 38.102.83.114 port 50582: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 29 05:37:25 np0005539509.novalocal sshd-session[1052]: Connection closed by 38.102.83.114 port 50556 [preauth]
Nov 29 05:37:25 np0005539509.novalocal cloud-init[1119]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 29 Nov 2025 05:37:25 +0000. Up 109.49 seconds.
Nov 29 05:37:25 np0005539509.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Nov 29 05:37:25 np0005539509.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Nov 29 05:37:25 np0005539509.novalocal dracut[1286]: dracut-057-102.git20250818.el9
Nov 29 05:37:25 np0005539509.novalocal cloud-init[1302]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 29 Nov 2025 05:37:25 +0000. Up 109.91 seconds.
Nov 29 05:37:25 np0005539509.novalocal cloud-init[1304]: #############################################################
Nov 29 05:37:25 np0005539509.novalocal cloud-init[1305]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 29 05:37:25 np0005539509.novalocal cloud-init[1307]: 256 SHA256:hrB3mjrdZLTcMKbz+gfWCLe38DVhYocwMJUD7MjsnkY root@np0005539509.novalocal (ECDSA)
Nov 29 05:37:25 np0005539509.novalocal cloud-init[1309]: 256 SHA256:Trsar2Pf1VIfL+mFrnewRWxstKDdw89MSvBE4PGb0GE root@np0005539509.novalocal (ED25519)
Nov 29 05:37:25 np0005539509.novalocal cloud-init[1311]: 3072 SHA256:zHsTTn9kXEJ/sx6L0lH9r9QVsWcWY+VXHpZ+XptPw2E root@np0005539509.novalocal (RSA)
Nov 29 05:37:25 np0005539509.novalocal cloud-init[1312]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 29 05:37:25 np0005539509.novalocal cloud-init[1313]: #############################################################
Nov 29 05:37:25 np0005539509.novalocal cloud-init[1302]: Cloud-init v. 24.4-7.el9 finished at Sat, 29 Nov 2025 05:37:25 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 110.10 seconds
Nov 29 05:37:25 np0005539509.novalocal dracut[1288]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 29 05:37:25 np0005539509.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Nov 29 05:37:25 np0005539509.novalocal systemd[1]: Reached target Cloud-init target.
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: Module 'resume' will not be installed, because it's in the list to be omitted!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: memstrack is not available
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 05:37:26 np0005539509.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 05:37:27 np0005539509.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 05:37:27 np0005539509.novalocal dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 05:37:27 np0005539509.novalocal dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 05:37:27 np0005539509.novalocal dracut[1288]: memstrack is not available
Nov 29 05:37:27 np0005539509.novalocal dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 05:37:27 np0005539509.novalocal dracut[1288]: *** Including module: systemd ***
Nov 29 05:37:27 np0005539509.novalocal dracut[1288]: *** Including module: fips ***
Nov 29 05:37:27 np0005539509.novalocal dracut[1288]: *** Including module: systemd-initrd ***
Nov 29 05:37:27 np0005539509.novalocal dracut[1288]: *** Including module: i18n ***
Nov 29 05:37:27 np0005539509.novalocal dracut[1288]: *** Including module: drm ***
Nov 29 05:37:28 np0005539509.novalocal dracut[1288]: *** Including module: prefixdevname ***
Nov 29 05:37:28 np0005539509.novalocal dracut[1288]: *** Including module: kernel-modules ***
Nov 29 05:37:28 np0005539509.novalocal kernel: block vda: the capability attribute has been deprecated.
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]: *** Including module: kernel-modules-extra ***
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]: *** Including module: qemu ***
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]: *** Including module: fstab-sys ***
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]: *** Including module: rootfs-block ***
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]: *** Including module: terminfo ***
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]: *** Including module: udev-rules ***
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]: Skipping udev rule: 91-permissions.rules
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 29 05:37:29 np0005539509.novalocal dracut[1288]: *** Including module: virtiofs ***
Nov 29 05:37:30 np0005539509.novalocal dracut[1288]: *** Including module: dracut-systemd ***
Nov 29 05:37:30 np0005539509.novalocal dracut[1288]: *** Including module: usrmount ***
Nov 29 05:37:30 np0005539509.novalocal dracut[1288]: *** Including module: base ***
Nov 29 05:37:30 np0005539509.novalocal dracut[1288]: *** Including module: fs-lib ***
Nov 29 05:37:30 np0005539509.novalocal dracut[1288]: *** Including module: kdumpbase ***
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:   microcode_ctl module: mangling fw_dir
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: configuration "intel" is ignored
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]: *** Including module: openssl ***
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]: *** Including module: shutdown ***
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]: *** Including module: squash ***
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]: *** Including modules done ***
Nov 29 05:37:31 np0005539509.novalocal dracut[1288]: *** Installing kernel module dependencies ***
Nov 29 05:37:32 np0005539509.novalocal dracut[1288]: *** Installing kernel module dependencies done ***
Nov 29 05:37:32 np0005539509.novalocal dracut[1288]: *** Resolving executable dependencies ***
Nov 29 05:37:34 np0005539509.novalocal dracut[1288]: *** Resolving executable dependencies done ***
Nov 29 05:37:34 np0005539509.novalocal dracut[1288]: *** Generating early-microcode cpio image ***
Nov 29 05:37:34 np0005539509.novalocal dracut[1288]: *** Store current command line parameters ***
Nov 29 05:37:34 np0005539509.novalocal dracut[1288]: Stored kernel commandline:
Nov 29 05:37:34 np0005539509.novalocal dracut[1288]: No dracut internal kernel commandline stored in the initramfs
Nov 29 05:37:34 np0005539509.novalocal dracut[1288]: *** Install squash loader ***
Nov 29 05:37:35 np0005539509.novalocal dracut[1288]: *** Squashing the files inside the initramfs ***
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: *** Squashing the files inside the initramfs done ***
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: *** Hardlinking files ***
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: Mode:           real
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: Files:          50
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: Linked:         0 files
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: Compared:       0 xattrs
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: Compared:       0 files
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: Saved:          0 B
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: Duration:       0.000580 seconds
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: *** Hardlinking files done ***
Nov 29 05:37:36 np0005539509.novalocal dracut[1288]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 29 05:37:37 np0005539509.novalocal kdumpctl[1022]: kdump: kexec: loaded kdump kernel
Nov 29 05:37:37 np0005539509.novalocal kdumpctl[1022]: kdump: Starting kdump: [OK]
Nov 29 05:37:37 np0005539509.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 29 05:37:37 np0005539509.novalocal systemd[1]: Startup finished in 1.551s (kernel) + 3.149s (initrd) + 1min 57.285s (userspace) = 2min 1.986s.
Nov 29 05:37:40 np0005539509.novalocal sshd-session[4296]: Accepted publickey for zuul from 38.102.83.114 port 37944 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 29 05:37:40 np0005539509.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 29 05:37:40 np0005539509.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 29 05:37:40 np0005539509.novalocal systemd-logind[785]: New session 1 of user zuul.
Nov 29 05:37:40 np0005539509.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 29 05:37:40 np0005539509.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 29 05:37:40 np0005539509.novalocal systemd[4300]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Queued start job for default target Main User Target.
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Created slice User Application Slice.
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Reached target Paths.
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Reached target Timers.
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Starting D-Bus User Message Bus Socket...
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Starting Create User's Volatile Files and Directories...
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Finished Create User's Volatile Files and Directories.
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Listening on D-Bus User Message Bus Socket.
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Reached target Sockets.
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Reached target Basic System.
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Reached target Main User Target.
Nov 29 05:37:41 np0005539509.novalocal systemd[4300]: Startup finished in 186ms.
Nov 29 05:37:41 np0005539509.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 29 05:37:41 np0005539509.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 29 05:37:41 np0005539509.novalocal sshd-session[4296]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:37:41 np0005539509.novalocal python3[4383]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:37:45 np0005539509.novalocal python3[4411]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:37:52 np0005539509.novalocal python3[4469]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:37:53 np0005539509.novalocal python3[4509]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 29 05:37:55 np0005539509.novalocal python3[4535]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrxzXgpPmVv8+7+5w1Oy1RsXOPeqdxTcUlq37d0RcYulAAKXWla/qJwAX46v5xh/Mg4GnRpk77lvDWcVnOQjFYQg3OeLmFgDDNPV0YL7URmIe/MvgcqM+Kx7/SQjk+hEt7rUIqkFUjeREX60T5eTEMANFgJrljqZcBTMgYr67x4v7oFELzKuZIO0SCAprJ9NYmdRaC+DsjZjU+DuFdHBnfZCpgkTFMCda2FAS9BneAVOIMCBu5RgNVJXeAgIsPX9GNX3qDJMKOluQLOW++2gbue3S1Nrs1GMPm+IPRD4yWc9eZs1tpR1jdP1BEPBpyQRQlUn4z7BUdEogSzYiXCSmqzN1o/R3mdi16bG8e2lHve5MQFABPko8KsgVOJu0H7b7wGo/oGdXH7sdlKuGoWxWyTFcq3RcVkaVgjKtt6zeswkrpxMUv9/6NXPrhIWqdQm/wVw0Pv2p98yq10QRPyBv5yI8zcNjxueUl3aM8SZML87E6lhkUFFdAuVof+Sl5Pz8= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:37:55 np0005539509.novalocal python3[4559]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:37:56 np0005539509.novalocal python3[4658]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:37:56 np0005539509.novalocal python3[4729]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394676.1664808-252-165655715209557/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=601e897125784122ba5d7472ada57b1d_id_rsa follow=False checksum=5ac8bea8bfb8f348688bf24843ddb1285b2d351d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:37:57 np0005539509.novalocal python3[4852]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:37:57 np0005539509.novalocal python3[4923]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394677.1863534-307-11101954615071/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=601e897125784122ba5d7472ada57b1d_id_rsa.pub follow=False checksum=48b31d706687f3385690285b8caeaea67ea8286c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:37:59 np0005539509.novalocal python3[4971]: ansible-ping Invoked with data=pong
Nov 29 05:38:00 np0005539509.novalocal python3[4995]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:38:02 np0005539509.novalocal python3[5053]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 29 05:38:03 np0005539509.novalocal python3[5085]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:04 np0005539509.novalocal python3[5109]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:04 np0005539509.novalocal python3[5133]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:05 np0005539509.novalocal python3[5157]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:05 np0005539509.novalocal python3[5181]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:05 np0005539509.novalocal python3[5205]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:07 np0005539509.novalocal sudo[5229]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qldpwcjhargdqcqkbasqigbtrvmmetog ; /usr/bin/python3'
Nov 29 05:38:07 np0005539509.novalocal sudo[5229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:07 np0005539509.novalocal python3[5231]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:07 np0005539509.novalocal sudo[5229]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:07 np0005539509.novalocal sudo[5307]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmrnnanioihzfrppiuxutzyojefiluvr ; /usr/bin/python3'
Nov 29 05:38:07 np0005539509.novalocal sudo[5307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:08 np0005539509.novalocal python3[5309]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:38:08 np0005539509.novalocal sudo[5307]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:08 np0005539509.novalocal sudo[5380]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjnaaoqufjvsfrabtykrgypluxbjzvvz ; /usr/bin/python3'
Nov 29 05:38:08 np0005539509.novalocal sudo[5380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:08 np0005539509.novalocal python3[5382]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394687.580609-32-113287568176534/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:08 np0005539509.novalocal sudo[5380]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:09 np0005539509.novalocal python3[5430]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:09 np0005539509.novalocal python3[5454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:09 np0005539509.novalocal python3[5478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:10 np0005539509.novalocal python3[5502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:10 np0005539509.novalocal python3[5526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:10 np0005539509.novalocal python3[5550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:10 np0005539509.novalocal python3[5574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:11 np0005539509.novalocal python3[5598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:11 np0005539509.novalocal python3[5622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:11 np0005539509.novalocal python3[5646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:12 np0005539509.novalocal python3[5670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:12 np0005539509.novalocal python3[5694]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:12 np0005539509.novalocal python3[5718]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:13 np0005539509.novalocal python3[5742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:13 np0005539509.novalocal python3[5766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:13 np0005539509.novalocal python3[5790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:13 np0005539509.novalocal python3[5814]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:14 np0005539509.novalocal python3[5838]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:14 np0005539509.novalocal python3[5862]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:14 np0005539509.novalocal python3[5886]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:15 np0005539509.novalocal python3[5910]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:15 np0005539509.novalocal python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:15 np0005539509.novalocal python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:16 np0005539509.novalocal python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:16 np0005539509.novalocal python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:16 np0005539509.novalocal python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:38:18 np0005539509.novalocal sudo[6054]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgthguwpuaxqphdvemmjttczfejmtaoe ; /usr/bin/python3'
Nov 29 05:38:18 np0005539509.novalocal sudo[6054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:19 np0005539509.novalocal python3[6056]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 05:38:19 np0005539509.novalocal systemd[1]: Starting Time & Date Service...
Nov 29 05:38:19 np0005539509.novalocal systemd[1]: Started Time & Date Service.
Nov 29 05:38:19 np0005539509.novalocal systemd-timedated[6058]: Changed time zone to 'UTC' (UTC).
Nov 29 05:38:19 np0005539509.novalocal sudo[6054]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:19 np0005539509.novalocal sudo[6085]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esawioqyhokcbcbczityfiyrklfxzmcj ; /usr/bin/python3'
Nov 29 05:38:19 np0005539509.novalocal sudo[6085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:19 np0005539509.novalocal python3[6087]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:19 np0005539509.novalocal sudo[6085]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:20 np0005539509.novalocal python3[6163]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:38:20 np0005539509.novalocal python3[6234]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764394700.03186-252-178861774041832/source _original_basename=tmpydqc7bn4 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:21 np0005539509.novalocal python3[6334]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:38:21 np0005539509.novalocal python3[6405]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764394701.1508152-303-221719467209209/source _original_basename=tmpw95s2vyv follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:22 np0005539509.novalocal sudo[6505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvsnfnmiloakbkwwimqlkppuaggxqeva ; /usr/bin/python3'
Nov 29 05:38:22 np0005539509.novalocal sudo[6505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:22 np0005539509.novalocal python3[6507]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:38:22 np0005539509.novalocal sudo[6505]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:22 np0005539509.novalocal sudo[6578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfztirefmirqjkmwylirtbrxtekckvfk ; /usr/bin/python3'
Nov 29 05:38:22 np0005539509.novalocal sudo[6578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:23 np0005539509.novalocal python3[6580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764394702.4281366-382-15652868957253/source _original_basename=tmp6ffexe8a follow=False checksum=01954034105cdb65b42722894a5c1036808c70c7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:23 np0005539509.novalocal sudo[6578]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:23 np0005539509.novalocal python3[6628]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:38:24 np0005539509.novalocal python3[6654]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:38:24 np0005539509.novalocal sudo[6732]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qecvazfiimvqziplhsqztuubfcxlfllf ; /usr/bin/python3'
Nov 29 05:38:24 np0005539509.novalocal sudo[6732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:24 np0005539509.novalocal python3[6734]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:38:24 np0005539509.novalocal sudo[6732]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:24 np0005539509.novalocal sudo[6805]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovgpgunpzxrpvsioyuyckhudeldujxsp ; /usr/bin/python3'
Nov 29 05:38:24 np0005539509.novalocal sudo[6805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:25 np0005539509.novalocal python3[6807]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394704.3946762-452-211574357919580/source _original_basename=tmpy78nwvio follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:25 np0005539509.novalocal sudo[6805]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:25 np0005539509.novalocal sudo[6856]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qspwjhpjrosyiedztyobxkqgbuxukbcu ; /usr/bin/python3'
Nov 29 05:38:25 np0005539509.novalocal sudo[6856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:25 np0005539509.novalocal python3[6858]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-3d5b-5bb0-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:38:25 np0005539509.novalocal sudo[6856]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:26 np0005539509.novalocal python3[6886]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-3d5b-5bb0-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 29 05:38:27 np0005539509.novalocal python3[6914]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:46 np0005539509.novalocal sudo[6938]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsqesmebviletykfwjfrievmcsrdurba ; /usr/bin/python3'
Nov 29 05:38:46 np0005539509.novalocal sudo[6938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:38:46 np0005539509.novalocal python3[6940]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:38:46 np0005539509.novalocal sudo[6938]: pam_unix(sudo:session): session closed for user root
Nov 29 05:38:49 np0005539509.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 05:39:25 np0005539509.novalocal sshd-session[6943]: Invalid user nrk from 71.70.164.48 port 55725
Nov 29 05:39:25 np0005539509.novalocal sshd-session[6943]: Received disconnect from 71.70.164.48 port 55725:11: Bye Bye [preauth]
Nov 29 05:39:25 np0005539509.novalocal sshd-session[6943]: Disconnected from invalid user nrk 71.70.164.48 port 55725 [preauth]
Nov 29 05:39:46 np0005539509.novalocal sshd-session[4310]: Received disconnect from 38.102.83.114 port 37944:11: disconnected by user
Nov 29 05:39:46 np0005539509.novalocal sshd-session[4310]: Disconnected from user zuul 38.102.83.114 port 37944
Nov 29 05:39:46 np0005539509.novalocal sshd-session[4296]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:39:46 np0005539509.novalocal systemd-logind[785]: Session 1 logged out. Waiting for processes to exit.
Nov 29 05:39:53 np0005539509.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 05:39:53 np0005539509.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 29 05:39:53 np0005539509.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 29 05:39:53 np0005539509.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 29 05:39:53 np0005539509.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 29 05:39:53 np0005539509.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 29 05:39:53 np0005539509.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 29 05:39:53 np0005539509.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 29 05:39:53 np0005539509.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 29 05:39:53 np0005539509.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 29 05:39:53 np0005539509.novalocal NetworkManager[856]: <info>  [1764394793.1347] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 05:39:53 np0005539509.novalocal systemd-udevd[6945]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 05:39:53 np0005539509.novalocal NetworkManager[856]: <info>  [1764394793.1556] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 05:39:53 np0005539509.novalocal NetworkManager[856]: <info>  [1764394793.1590] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 29 05:39:53 np0005539509.novalocal NetworkManager[856]: <info>  [1764394793.1596] device (eth1): carrier: link connected
Nov 29 05:39:53 np0005539509.novalocal NetworkManager[856]: <info>  [1764394793.1598] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 05:39:53 np0005539509.novalocal NetworkManager[856]: <info>  [1764394793.1606] policy: auto-activating connection 'Wired connection 1' (869e6d79-5f7b-3b9f-b76e-078155d890a2)
Nov 29 05:39:53 np0005539509.novalocal NetworkManager[856]: <info>  [1764394793.1611] device (eth1): Activation: starting connection 'Wired connection 1' (869e6d79-5f7b-3b9f-b76e-078155d890a2)
Nov 29 05:39:53 np0005539509.novalocal NetworkManager[856]: <info>  [1764394793.1612] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 05:39:53 np0005539509.novalocal NetworkManager[856]: <info>  [1764394793.1615] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 05:39:53 np0005539509.novalocal NetworkManager[856]: <info>  [1764394793.1620] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 05:39:53 np0005539509.novalocal NetworkManager[856]: <info>  [1764394793.1625] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:39:53 np0005539509.novalocal systemd[4300]: Starting Mark boot as successful...
Nov 29 05:39:53 np0005539509.novalocal systemd[4300]: Finished Mark boot as successful.
Nov 29 05:39:54 np0005539509.novalocal sshd-session[6950]: Accepted publickey for zuul from 38.102.83.114 port 58948 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:39:54 np0005539509.novalocal systemd-logind[785]: New session 3 of user zuul.
Nov 29 05:39:54 np0005539509.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 29 05:39:54 np0005539509.novalocal sshd-session[6950]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:39:54 np0005539509.novalocal python3[6977]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-4e5a-44df-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:40:04 np0005539509.novalocal sudo[7055]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yriwiatdigkkxlvbkejzmqpkppmlpfei ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:40:04 np0005539509.novalocal sudo[7055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:40:04 np0005539509.novalocal python3[7057]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:40:04 np0005539509.novalocal sudo[7055]: pam_unix(sudo:session): session closed for user root
Nov 29 05:40:04 np0005539509.novalocal sudo[7128]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aztcsuvoaxtheelglyqecepeyxawzksw ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:40:04 np0005539509.novalocal sudo[7128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:40:04 np0005539509.novalocal python3[7130]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394804.1989338-155-88084161914430/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=c1fba3f03f63934d2121e957385cfe4c48be3062 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:40:05 np0005539509.novalocal sudo[7128]: pam_unix(sudo:session): session closed for user root
Nov 29 05:40:05 np0005539509.novalocal sudo[7178]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klzqqcqtpxqccssnvhvlaonqeceacvfu ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:40:05 np0005539509.novalocal sudo[7178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:40:05 np0005539509.novalocal python3[7180]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: Stopping Network Manager...
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[856]: <info>  [1764394805.5969] caught SIGTERM, shutting down normally.
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[856]: <info>  [1764394805.5979] dhcp4 (eth0): canceled DHCP transaction
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[856]: <info>  [1764394805.5979] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[856]: <info>  [1764394805.5979] dhcp4 (eth0): state changed no lease
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[856]: <info>  [1764394805.5981] manager: NetworkManager state is now CONNECTING
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[856]: <info>  [1764394805.6083] dhcp4 (eth1): canceled DHCP transaction
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[856]: <info>  [1764394805.6084] dhcp4 (eth1): state changed no lease
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[856]: <info>  [1764394805.6178] exiting (success)
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: Stopped Network Manager.
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: NetworkManager.service: Consumed 2.017s CPU time, 9.9M memory peak.
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: Starting Network Manager...
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.6967] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:ba0dbca2-f496-4536-953e-379bfe5fc9e9)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.6969] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7025] manager[0x55c7c3208070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: Starting Hostname Service...
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: Started Hostname Service.
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7745] hostname: hostname: using hostnamed
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7746] hostname: static hostname changed from (none) to "np0005539509.novalocal"
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7754] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7759] manager[0x55c7c3208070]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7760] manager[0x55c7c3208070]: rfkill: WWAN hardware radio set enabled
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7790] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7791] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7792] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7792] manager: Networking is enabled by state file
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7795] settings: Loaded settings plugin: keyfile (internal)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7800] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7828] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7838] dhcp: init: Using DHCP client 'internal'
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7840] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7846] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7852] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7861] device (lo): Activation: starting connection 'lo' (e88f289b-57af-451c-b662-f7b3e0248e91)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7869] device (eth0): carrier: link connected
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7874] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7878] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7879] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7884] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7890] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7896] device (eth1): carrier: link connected
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7900] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7905] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (869e6d79-5f7b-3b9f-b76e-078155d890a2) (indicated)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7906] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7911] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7917] device (eth1): Activation: starting connection 'Wired connection 1' (869e6d79-5f7b-3b9f-b76e-078155d890a2)
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: Started Network Manager.
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7931] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7940] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7945] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7948] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7953] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7959] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7964] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7970] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7976] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7991] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.7996] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.8011] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.8015] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:40:05 np0005539509.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.8047] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.8053] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.8064] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.8076] device (lo): Activation: successful, device activated.
Nov 29 05:40:05 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394805.8094] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 05:40:05 np0005539509.novalocal sudo[7178]: pam_unix(sudo:session): session closed for user root
Nov 29 05:40:06 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394806.0200] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 05:40:06 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394806.0239] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 05:40:06 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394806.0268] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 05:40:06 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394806.0273] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 05:40:06 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394806.0277] device (eth0): Activation: successful, device activated.
Nov 29 05:40:06 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394806.0285] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 05:40:06 np0005539509.novalocal python3[7245]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-4e5a-44df-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:40:16 np0005539509.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:40:35 np0005539509.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5285] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 05:40:51 np0005539509.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:40:51 np0005539509.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5659] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5664] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5675] device (eth1): Activation: successful, device activated.
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5685] manager: startup complete
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5689] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <warn>  [1764394851.5699] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5712] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 29 05:40:51 np0005539509.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5824] dhcp4 (eth1): canceled DHCP transaction
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5825] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5825] dhcp4 (eth1): state changed no lease
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5850] policy: auto-activating connection 'ci-private-network' (f58d442a-350a-5956-a954-8dae41cac9cb)
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5857] device (eth1): Activation: starting connection 'ci-private-network' (f58d442a-350a-5956-a954-8dae41cac9cb)
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5858] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5863] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5874] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 05:40:51 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394851.5886] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 05:40:52 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394852.3898] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 05:40:52 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394852.3903] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 05:40:52 np0005539509.novalocal NetworkManager[7192]: <info>  [1764394852.3915] device (eth1): Activation: successful, device activated.
Nov 29 05:41:02 np0005539509.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:41:06 np0005539509.novalocal sshd-session[6953]: Received disconnect from 38.102.83.114 port 58948:11: disconnected by user
Nov 29 05:41:06 np0005539509.novalocal sshd-session[6953]: Disconnected from user zuul 38.102.83.114 port 58948
Nov 29 05:41:06 np0005539509.novalocal sshd-session[6950]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:41:06 np0005539509.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 29 05:41:06 np0005539509.novalocal systemd[1]: session-3.scope: Consumed 1.856s CPU time.
Nov 29 05:41:06 np0005539509.novalocal systemd-logind[785]: Session 3 logged out. Waiting for processes to exit.
Nov 29 05:41:06 np0005539509.novalocal systemd-logind[785]: Removed session 3.
Nov 29 05:41:44 np0005539509.novalocal sshd-session[7294]: Accepted publickey for zuul from 38.102.83.114 port 59512 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:41:44 np0005539509.novalocal systemd-logind[785]: New session 4 of user zuul.
Nov 29 05:41:44 np0005539509.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 29 05:41:44 np0005539509.novalocal sshd-session[7294]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:41:44 np0005539509.novalocal sudo[7373]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlvubjpwkyoagfphavwjggcubnayzwfy ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:41:44 np0005539509.novalocal sudo[7373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:41:45 np0005539509.novalocal python3[7375]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:41:45 np0005539509.novalocal sudo[7373]: pam_unix(sudo:session): session closed for user root
Nov 29 05:41:45 np0005539509.novalocal sudo[7446]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwltydphxpljehuoobzyliqzrvekjqrb ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:41:45 np0005539509.novalocal sudo[7446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:41:45 np0005539509.novalocal python3[7448]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394904.7427914-373-147243090404980/source _original_basename=tmp38mmtp3x follow=False checksum=95c43167cb69fbe3f3b9eff0c3ecf63c2bbd5b70 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:41:45 np0005539509.novalocal sudo[7446]: pam_unix(sudo:session): session closed for user root
Nov 29 05:41:48 np0005539509.novalocal sshd-session[7297]: Connection closed by 38.102.83.114 port 59512
Nov 29 05:41:48 np0005539509.novalocal sshd-session[7294]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:41:48 np0005539509.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 29 05:41:48 np0005539509.novalocal systemd-logind[785]: Session 4 logged out. Waiting for processes to exit.
Nov 29 05:41:48 np0005539509.novalocal systemd-logind[785]: Removed session 4.
Nov 29 05:43:05 np0005539509.novalocal systemd[4300]: Created slice User Background Tasks Slice.
Nov 29 05:43:05 np0005539509.novalocal systemd[4300]: Starting Cleanup of User's Temporary Files and Directories...
Nov 29 05:43:05 np0005539509.novalocal systemd[4300]: Finished Cleanup of User's Temporary Files and Directories.
Nov 29 05:43:36 np0005539509.novalocal sshd-session[7476]: Invalid user bounce from 71.70.164.48 port 57960
Nov 29 05:43:36 np0005539509.novalocal sshd-session[7476]: Received disconnect from 71.70.164.48 port 57960:11: Bye Bye [preauth]
Nov 29 05:43:36 np0005539509.novalocal sshd-session[7476]: Disconnected from invalid user bounce 71.70.164.48 port 57960 [preauth]
Nov 29 05:46:02 np0005539509.novalocal sshd-session[7479]: Received disconnect from 71.70.164.48 port 52132:11: Bye Bye [preauth]
Nov 29 05:46:02 np0005539509.novalocal sshd-session[7479]: Disconnected from authenticating user root 71.70.164.48 port 52132 [preauth]
Nov 29 05:47:03 np0005539509.novalocal sshd-session[7483]: Accepted publickey for zuul from 38.102.83.114 port 51878 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:47:03 np0005539509.novalocal systemd-logind[785]: New session 5 of user zuul.
Nov 29 05:47:03 np0005539509.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 29 05:47:03 np0005539509.novalocal sshd-session[7483]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:47:03 np0005539509.novalocal sudo[7510]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqqsyuxbbmktrggbczpfsntjkpsxqqqa ; /usr/bin/python3'
Nov 29 05:47:03 np0005539509.novalocal sudo[7510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:03 np0005539509.novalocal python3[7512]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ef9-e89a-b110-1686-000000000ca2-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:03 np0005539509.novalocal sudo[7510]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:04 np0005539509.novalocal sudo[7539]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grkrvkxdsobqelcpvadpjpywddpuonpf ; /usr/bin/python3'
Nov 29 05:47:04 np0005539509.novalocal sudo[7539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:04 np0005539509.novalocal python3[7541]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:04 np0005539509.novalocal sudo[7539]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:04 np0005539509.novalocal sudo[7565]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wavkpwkyfjfvxtqbtoopiahnadnzjmjh ; /usr/bin/python3'
Nov 29 05:47:04 np0005539509.novalocal sudo[7565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:04 np0005539509.novalocal python3[7567]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:04 np0005539509.novalocal sudo[7565]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:04 np0005539509.novalocal sudo[7591]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frrlgbakirgopkkztmpjdarsuqwfaswk ; /usr/bin/python3'
Nov 29 05:47:04 np0005539509.novalocal sudo[7591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:04 np0005539509.novalocal python3[7593]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:04 np0005539509.novalocal sudo[7591]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:04 np0005539509.novalocal sudo[7617]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdpbwnnwoihjwlkbtrnttqnghzcnglnr ; /usr/bin/python3'
Nov 29 05:47:04 np0005539509.novalocal sudo[7617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:05 np0005539509.novalocal python3[7619]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:05 np0005539509.novalocal sudo[7617]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:05 np0005539509.novalocal sudo[7643]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgddxebryellahstzmxbzvybihznuetu ; /usr/bin/python3'
Nov 29 05:47:05 np0005539509.novalocal sudo[7643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:05 np0005539509.novalocal python3[7645]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:05 np0005539509.novalocal sudo[7643]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:06 np0005539509.novalocal sudo[7721]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttjirxavppzehfubrmqmneeqehjxfaca ; /usr/bin/python3'
Nov 29 05:47:06 np0005539509.novalocal sudo[7721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:06 np0005539509.novalocal python3[7723]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:47:06 np0005539509.novalocal sudo[7721]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:06 np0005539509.novalocal sudo[7794]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmsorepaeckixnuseefqpncrthpycvjx ; /usr/bin/python3'
Nov 29 05:47:06 np0005539509.novalocal sudo[7794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:06 np0005539509.novalocal python3[7796]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395225.862213-366-176894584212579/source _original_basename=tmp9avxdlxn follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:06 np0005539509.novalocal sudo[7794]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:07 np0005539509.novalocal sudo[7844]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yovlpwntifssubaesthmbvhmzxkzvnxf ; /usr/bin/python3'
Nov 29 05:47:07 np0005539509.novalocal sudo[7844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:07 np0005539509.novalocal python3[7846]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 05:47:07 np0005539509.novalocal systemd[1]: Reloading.
Nov 29 05:47:07 np0005539509.novalocal systemd-rc-local-generator[7864]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 05:47:08 np0005539509.novalocal sudo[7844]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:09 np0005539509.novalocal sudo[7900]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqmiotlonalmexfzvzhxjkzjgurhccgq ; /usr/bin/python3'
Nov 29 05:47:09 np0005539509.novalocal sudo[7900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:09 np0005539509.novalocal python3[7902]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 29 05:47:09 np0005539509.novalocal sudo[7900]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:09 np0005539509.novalocal sudo[7926]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojwbreomavoxjausdtftskxgvgthvshf ; /usr/bin/python3'
Nov 29 05:47:09 np0005539509.novalocal sudo[7926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:10 np0005539509.novalocal python3[7928]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:10 np0005539509.novalocal sudo[7926]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:10 np0005539509.novalocal sudo[7954]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yicjfkaqwtpohsoujqbjocdszfnjbtpf ; /usr/bin/python3'
Nov 29 05:47:10 np0005539509.novalocal sudo[7954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:10 np0005539509.novalocal python3[7956]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:10 np0005539509.novalocal sudo[7954]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:10 np0005539509.novalocal sudo[7982]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttxdwbqyakhzjzfrgpdwxwjblzjuocyq ; /usr/bin/python3'
Nov 29 05:47:10 np0005539509.novalocal sudo[7982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:10 np0005539509.novalocal python3[7984]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:10 np0005539509.novalocal sudo[7982]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:10 np0005539509.novalocal sudo[8010]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvoufqucohumtpmxddprqmohtbwnqehk ; /usr/bin/python3'
Nov 29 05:47:10 np0005539509.novalocal sudo[8010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:11 np0005539509.novalocal python3[8012]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:11 np0005539509.novalocal sudo[8010]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:11 np0005539509.novalocal python3[8039]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-b110-1686-000000000ca9-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:12 np0005539509.novalocal python3[8069]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 29 05:47:15 np0005539509.novalocal sshd-session[7486]: Connection closed by 38.102.83.114 port 51878
Nov 29 05:47:15 np0005539509.novalocal sshd-session[7483]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:47:15 np0005539509.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Nov 29 05:47:15 np0005539509.novalocal systemd[1]: session-5.scope: Consumed 4.730s CPU time.
Nov 29 05:47:15 np0005539509.novalocal systemd-logind[785]: Session 5 logged out. Waiting for processes to exit.
Nov 29 05:47:15 np0005539509.novalocal systemd-logind[785]: Removed session 5.
Nov 29 05:47:16 np0005539509.novalocal sshd-session[8073]: Accepted publickey for zuul from 38.102.83.114 port 46846 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:47:16 np0005539509.novalocal systemd-logind[785]: New session 6 of user zuul.
Nov 29 05:47:16 np0005539509.novalocal systemd[1]: Started Session 6 of User zuul.
Nov 29 05:47:16 np0005539509.novalocal sshd-session[8073]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:47:16 np0005539509.novalocal sudo[8100]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiyvwrgvgmatmdahgfpejsfcqxoicioj ; /usr/bin/python3'
Nov 29 05:47:16 np0005539509.novalocal sudo[8100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:17 np0005539509.novalocal python3[8102]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 29 05:47:30 np0005539509.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 29 05:47:30 np0005539509.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:47:30 np0005539509.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:47:30 np0005539509.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:47:30 np0005539509.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:47:30 np0005539509.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:47:30 np0005539509.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:47:30 np0005539509.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:47:39 np0005539509.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 29 05:47:39 np0005539509.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:47:39 np0005539509.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:47:39 np0005539509.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:47:39 np0005539509.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:47:39 np0005539509.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:47:39 np0005539509.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:47:39 np0005539509.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:47:48 np0005539509.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 29 05:47:48 np0005539509.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:47:48 np0005539509.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:47:48 np0005539509.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:47:48 np0005539509.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:47:48 np0005539509.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:47:48 np0005539509.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:47:48 np0005539509.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:47:49 np0005539509.novalocal setsebool[8169]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 29 05:47:49 np0005539509.novalocal setsebool[8169]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 29 05:48:00 np0005539509.novalocal kernel: SELinux:  Converting 388 SID table entries...
Nov 29 05:48:00 np0005539509.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:48:00 np0005539509.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:48:00 np0005539509.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:48:00 np0005539509.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:48:00 np0005539509.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:48:00 np0005539509.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:48:00 np0005539509.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:48:12 np0005539509.novalocal sshd-session[8883]: Received disconnect from 71.70.164.48 port 45505:11: Bye Bye [preauth]
Nov 29 05:48:12 np0005539509.novalocal sshd-session[8883]: Disconnected from authenticating user root 71.70.164.48 port 45505 [preauth]
Nov 29 05:48:18 np0005539509.novalocal dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 05:48:18 np0005539509.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 05:48:18 np0005539509.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 29 05:48:18 np0005539509.novalocal systemd[1]: Reloading.
Nov 29 05:48:18 np0005539509.novalocal systemd-rc-local-generator[8928]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 05:48:18 np0005539509.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 05:48:20 np0005539509.novalocal sudo[8100]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:24 np0005539509.novalocal python3[12661]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-4d52-d96a-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:48:25 np0005539509.novalocal kernel: evm: overlay not supported
Nov 29 05:48:25 np0005539509.novalocal systemd[4300]: Starting D-Bus User Message Bus...
Nov 29 05:48:25 np0005539509.novalocal dbus-broker-launch[13389]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 29 05:48:25 np0005539509.novalocal dbus-broker-launch[13389]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 29 05:48:25 np0005539509.novalocal systemd[4300]: Started D-Bus User Message Bus.
Nov 29 05:48:25 np0005539509.novalocal dbus-broker-lau[13389]: Ready
Nov 29 05:48:25 np0005539509.novalocal systemd[4300]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 05:48:25 np0005539509.novalocal systemd[4300]: Created slice Slice /user.
Nov 29 05:48:25 np0005539509.novalocal systemd[4300]: podman-13294.scope: unit configures an IP firewall, but not running as root.
Nov 29 05:48:25 np0005539509.novalocal systemd[4300]: (This warning is only shown for the first unit using IP firewalling.)
Nov 29 05:48:25 np0005539509.novalocal systemd[4300]: Started podman-13294.scope.
Nov 29 05:48:25 np0005539509.novalocal systemd[4300]: Started podman-pause-5b7b0f92.scope.
Nov 29 05:48:26 np0005539509.novalocal sudo[13900]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zafbnuyajpusqmsspwpqewyetcveddom ; /usr/bin/python3'
Nov 29 05:48:26 np0005539509.novalocal sudo[13900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:26 np0005539509.novalocal python3[13916]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.97:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.97:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:48:26 np0005539509.novalocal python3[13916]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 29 05:48:26 np0005539509.novalocal sudo[13900]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:26 np0005539509.novalocal sshd-session[8076]: Connection closed by 38.102.83.114 port 46846
Nov 29 05:48:26 np0005539509.novalocal sshd-session[8073]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:48:26 np0005539509.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Nov 29 05:48:26 np0005539509.novalocal systemd[1]: session-6.scope: Consumed 59.272s CPU time.
Nov 29 05:48:26 np0005539509.novalocal systemd-logind[785]: Session 6 logged out. Waiting for processes to exit.
Nov 29 05:48:26 np0005539509.novalocal systemd-logind[785]: Removed session 6.
Nov 29 05:48:47 np0005539509.novalocal sshd-session[20995]: Connection closed by 38.102.83.107 port 37870 [preauth]
Nov 29 05:48:47 np0005539509.novalocal sshd-session[21000]: Connection closed by 38.102.83.107 port 37880 [preauth]
Nov 29 05:48:47 np0005539509.novalocal sshd-session[20997]: Unable to negotiate with 38.102.83.107 port 37896: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 29 05:48:47 np0005539509.novalocal sshd-session[21002]: Unable to negotiate with 38.102.83.107 port 37910: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 29 05:48:47 np0005539509.novalocal sshd-session[21003]: Unable to negotiate with 38.102.83.107 port 37920: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 29 05:48:52 np0005539509.novalocal sshd-session[22537]: Accepted publickey for zuul from 38.102.83.114 port 36032 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:48:52 np0005539509.novalocal systemd-logind[785]: New session 7 of user zuul.
Nov 29 05:48:52 np0005539509.novalocal systemd[1]: Started Session 7 of User zuul.
Nov 29 05:48:52 np0005539509.novalocal sshd-session[22537]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:48:52 np0005539509.novalocal irqbalance[781]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 29 05:48:52 np0005539509.novalocal irqbalance[781]: IRQ 27 affinity is now unmanaged
Nov 29 05:48:52 np0005539509.novalocal python3[22637]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:48:53 np0005539509.novalocal sudo[22816]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnpnppglynuudqeibvyqsrwtoiafqooi ; /usr/bin/python3'
Nov 29 05:48:53 np0005539509.novalocal sudo[22816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:53 np0005539509.novalocal python3[22826]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:48:53 np0005539509.novalocal sudo[22816]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:54 np0005539509.novalocal sudo[23118]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aigkskyklqvoyxkujlvibqvnwakpfgds ; /usr/bin/python3'
Nov 29 05:48:54 np0005539509.novalocal sudo[23118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:54 np0005539509.novalocal python3[23128]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005539509.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 29 05:48:54 np0005539509.novalocal useradd[23186]: new group: name=cloud-admin, GID=1002
Nov 29 05:48:54 np0005539509.novalocal useradd[23186]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Nov 29 05:48:54 np0005539509.novalocal sudo[23118]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:54 np0005539509.novalocal sudo[23316]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trdfbaivdvioreeuihltomddfyzfvdpz ; /usr/bin/python3'
Nov 29 05:48:54 np0005539509.novalocal sudo[23316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:54 np0005539509.novalocal python3[23326]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:48:54 np0005539509.novalocal sudo[23316]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:55 np0005539509.novalocal sudo[23554]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbnvkgaczvnzuzhurkwcnptzpvrolxle ; /usr/bin/python3'
Nov 29 05:48:55 np0005539509.novalocal sudo[23554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:55 np0005539509.novalocal python3[23565]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:48:55 np0005539509.novalocal sudo[23554]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:55 np0005539509.novalocal sudo[23779]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwkwtwdvmkftbxykjiqonwrvzeesnptw ; /usr/bin/python3'
Nov 29 05:48:55 np0005539509.novalocal sudo[23779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:55 np0005539509.novalocal python3[23786]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395335.1038623-168-10431261003854/source _original_basename=tmpnk1t0ssr follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:48:55 np0005539509.novalocal sudo[23779]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:56 np0005539509.novalocal sudo[24086]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akkcwnbffqawjwktibkvtaiabzairgmc ; /usr/bin/python3'
Nov 29 05:48:56 np0005539509.novalocal sudo[24086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:56 np0005539509.novalocal python3[24095]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Nov 29 05:48:56 np0005539509.novalocal systemd[1]: Starting Hostname Service...
Nov 29 05:48:56 np0005539509.novalocal systemd[1]: Started Hostname Service.
Nov 29 05:48:57 np0005539509.novalocal systemd-hostnamed[24185]: Changed pretty hostname to 'compute-1'
Nov 29 05:48:57 compute-1 systemd-hostnamed[24185]: Hostname set to <compute-1> (static)
Nov 29 05:48:57 compute-1 NetworkManager[7192]: <info>  [1764395337.0309] hostname: static hostname changed from "np0005539509.novalocal" to "compute-1"
Nov 29 05:48:57 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:48:57 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:48:57 compute-1 sudo[24086]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:57 compute-1 sshd-session[22585]: Connection closed by 38.102.83.114 port 36032
Nov 29 05:48:57 compute-1 sshd-session[22537]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:48:57 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Nov 29 05:48:57 compute-1 systemd[1]: session-7.scope: Consumed 2.720s CPU time.
Nov 29 05:48:57 compute-1 systemd-logind[785]: Session 7 logged out. Waiting for processes to exit.
Nov 29 05:48:57 compute-1 systemd-logind[785]: Removed session 7.
Nov 29 05:49:07 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:49:15 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 05:49:15 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 29 05:49:15 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1min 7.734s CPU time.
Nov 29 05:49:15 compute-1 systemd[1]: run-r02812b7a0b4548359c0b36bd756f2b3e.service: Deactivated successfully.
Nov 29 05:49:27 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 05:50:02 compute-1 sshd-session[29920]: Invalid user ntps from 45.55.249.98 port 58638
Nov 29 05:50:02 compute-1 sshd-session[29920]: Received disconnect from 45.55.249.98 port 58638:11: Bye Bye [preauth]
Nov 29 05:50:02 compute-1 sshd-session[29920]: Disconnected from invalid user ntps 45.55.249.98 port 58638 [preauth]
Nov 29 05:50:26 compute-1 sshd-session[29923]: Invalid user sonarqube from 71.70.164.48 port 39007
Nov 29 05:50:26 compute-1 sshd-session[29923]: Received disconnect from 71.70.164.48 port 39007:11: Bye Bye [preauth]
Nov 29 05:50:26 compute-1 sshd-session[29923]: Disconnected from invalid user sonarqube 71.70.164.48 port 39007 [preauth]
Nov 29 05:50:55 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 29 05:50:55 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 29 05:50:55 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 29 05:50:55 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 29 05:52:23 compute-1 sshd-session[29932]: Invalid user Test from 118.194.230.250 port 47980
Nov 29 05:52:23 compute-1 sshd-session[29932]: Received disconnect from 118.194.230.250 port 47980:11: Bye Bye [preauth]
Nov 29 05:52:23 compute-1 sshd-session[29932]: Disconnected from invalid user Test 118.194.230.250 port 47980 [preauth]
Nov 29 05:52:42 compute-1 sshd-session[29934]: Invalid user client from 71.70.164.48 port 33376
Nov 29 05:52:42 compute-1 sshd-session[29934]: Received disconnect from 71.70.164.48 port 33376:11: Bye Bye [preauth]
Nov 29 05:52:42 compute-1 sshd-session[29934]: Disconnected from invalid user client 71.70.164.48 port 33376 [preauth]
Nov 29 05:53:01 compute-1 sshd-session[29936]: Accepted publickey for zuul from 38.102.83.107 port 35354 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 05:53:01 compute-1 systemd-logind[785]: New session 8 of user zuul.
Nov 29 05:53:01 compute-1 systemd[1]: Started Session 8 of User zuul.
Nov 29 05:53:01 compute-1 sshd-session[29936]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:53:02 compute-1 python3[30012]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:53:04 compute-1 sudo[30126]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjvdiambxxzpkrrqlcelohyguhikufgk ; /usr/bin/python3'
Nov 29 05:53:04 compute-1 sudo[30126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:04 compute-1 python3[30128]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:04 compute-1 sudo[30126]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:04 compute-1 sudo[30199]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-przucwdhunipnvpmtuphsivufqlsxglu ; /usr/bin/python3'
Nov 29 05:53:04 compute-1 sudo[30199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:04 compute-1 python3[30201]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=delorean.repo follow=False checksum=a16f090252000d02a7f7d540bb10f7c1c9cd4ac5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:04 compute-1 sudo[30199]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:04 compute-1 sudo[30225]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geuzsjogabcbbvgdvfltntzuujqbxifl ; /usr/bin/python3'
Nov 29 05:53:04 compute-1 sudo[30225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:05 compute-1 python3[30227]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:05 compute-1 sudo[30225]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:05 compute-1 sudo[30298]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvlvzjqyxaxtfqqtqinhcfendlrgvjia ; /usr/bin/python3'
Nov 29 05:53:05 compute-1 sudo[30298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:05 compute-1 python3[30300]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:05 compute-1 sudo[30298]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:05 compute-1 sudo[30324]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naxfazfodeovbsauhvfjdhdvwswwqbsz ; /usr/bin/python3'
Nov 29 05:53:05 compute-1 sudo[30324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:05 compute-1 python3[30326]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:05 compute-1 sudo[30324]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:06 compute-1 sudo[30397]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xafjeffjsxwocaopcfxbrxglkpdyvzad ; /usr/bin/python3'
Nov 29 05:53:06 compute-1 sudo[30397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:06 compute-1 python3[30399]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:06 compute-1 sudo[30397]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:06 compute-1 sudo[30423]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuhihrpqlafembkqiikmxlutagxyasit ; /usr/bin/python3'
Nov 29 05:53:06 compute-1 sudo[30423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:06 compute-1 python3[30425]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:06 compute-1 sudo[30423]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:06 compute-1 sudo[30496]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyuypfzwkfmvlrtugugcvwescfyyqfck ; /usr/bin/python3'
Nov 29 05:53:06 compute-1 sudo[30496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:07 compute-1 python3[30498]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:07 compute-1 sudo[30496]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:07 compute-1 sudo[30522]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpndlfnrowwofyazlrfafoqygpruqrmg ; /usr/bin/python3'
Nov 29 05:53:07 compute-1 sudo[30522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:07 compute-1 python3[30524]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:07 compute-1 sudo[30522]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:07 compute-1 sudo[30595]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpvmnfgkonizkegspncsyuglnfvdtwjr ; /usr/bin/python3'
Nov 29 05:53:07 compute-1 sudo[30595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:07 compute-1 python3[30597]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:07 compute-1 sudo[30595]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:07 compute-1 sudo[30621]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iocuqupneyhxozgicfwzfxqsuhutzcbo ; /usr/bin/python3'
Nov 29 05:53:07 compute-1 sudo[30621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:07 compute-1 python3[30623]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:07 compute-1 sudo[30621]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:08 compute-1 sudo[30694]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yendsnstykcvodejsbqfettbbivusnrv ; /usr/bin/python3'
Nov 29 05:53:08 compute-1 sudo[30694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:08 compute-1 python3[30696]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:08 compute-1 sudo[30694]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:08 compute-1 sudo[30720]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctzuymmrghgsnznzzvwcdtgffazrjjgc ; /usr/bin/python3'
Nov 29 05:53:08 compute-1 sudo[30720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:08 compute-1 python3[30722]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:53:08 compute-1 sudo[30720]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:09 compute-1 sudo[30793]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnpfajnpdjobjkxgrnowxjpitelvocjq ; /usr/bin/python3'
Nov 29 05:53:09 compute-1 sudo[30793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:53:09 compute-1 python3[30795]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=25e801a9a05537c191e2aa500f19076ac31d3e5b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:53:09 compute-1 sudo[30793]: pam_unix(sudo:session): session closed for user root
Nov 29 05:53:20 compute-1 python3[30844]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:53:23 compute-1 sshd-session[30846]: Received disconnect from 45.55.249.98 port 44390:11: Bye Bye [preauth]
Nov 29 05:53:23 compute-1 sshd-session[30846]: Disconnected from authenticating user root 45.55.249.98 port 44390 [preauth]
Nov 29 05:54:22 compute-1 sshd-session[30848]: Invalid user rancher from 118.194.230.250 port 48122
Nov 29 05:54:22 compute-1 sshd-session[30848]: Received disconnect from 118.194.230.250 port 48122:11: Bye Bye [preauth]
Nov 29 05:54:22 compute-1 sshd-session[30848]: Disconnected from invalid user rancher 118.194.230.250 port 48122 [preauth]
Nov 29 05:54:26 compute-1 sshd-session[30850]: Received disconnect from 45.55.249.98 port 53468:11: Bye Bye [preauth]
Nov 29 05:54:26 compute-1 sshd-session[30850]: Disconnected from authenticating user root 45.55.249.98 port 53468 [preauth]
Nov 29 05:54:51 compute-1 sshd-session[30852]: Invalid user hello from 71.70.164.48 port 59500
Nov 29 05:54:51 compute-1 sshd-session[30852]: Received disconnect from 71.70.164.48 port 59500:11: Bye Bye [preauth]
Nov 29 05:54:51 compute-1 sshd-session[30852]: Disconnected from invalid user hello 71.70.164.48 port 59500 [preauth]
Nov 29 05:55:35 compute-1 sshd-session[30854]: Received disconnect from 45.55.249.98 port 42904:11: Bye Bye [preauth]
Nov 29 05:55:35 compute-1 sshd-session[30854]: Disconnected from authenticating user root 45.55.249.98 port 42904 [preauth]
Nov 29 05:55:41 compute-1 sshd-session[30856]: Invalid user ariel from 118.194.230.250 port 48226
Nov 29 05:55:41 compute-1 sshd-session[30856]: Received disconnect from 118.194.230.250 port 48226:11: Bye Bye [preauth]
Nov 29 05:55:41 compute-1 sshd-session[30856]: Disconnected from invalid user ariel 118.194.230.250 port 48226 [preauth]
Nov 29 05:56:45 compute-1 sshd-session[30858]: Received disconnect from 45.55.249.98 port 33792:11: Bye Bye [preauth]
Nov 29 05:56:45 compute-1 sshd-session[30858]: Disconnected from authenticating user root 45.55.249.98 port 33792 [preauth]
Nov 29 05:56:59 compute-1 sshd-session[30860]: Invalid user arkserver from 118.194.230.250 port 48326
Nov 29 05:56:59 compute-1 sshd-session[30860]: Received disconnect from 118.194.230.250 port 48326:11: Bye Bye [preauth]
Nov 29 05:56:59 compute-1 sshd-session[30860]: Disconnected from invalid user arkserver 118.194.230.250 port 48326 [preauth]
Nov 29 05:57:14 compute-1 sshd-session[30862]: Invalid user ts1 from 71.70.164.48 port 58385
Nov 29 05:57:14 compute-1 sshd-session[30862]: Received disconnect from 71.70.164.48 port 58385:11: Bye Bye [preauth]
Nov 29 05:57:14 compute-1 sshd-session[30862]: Disconnected from invalid user ts1 71.70.164.48 port 58385 [preauth]
Nov 29 05:57:47 compute-1 sshd-session[30866]: Invalid user user5 from 45.55.249.98 port 59136
Nov 29 05:57:47 compute-1 sshd-session[30866]: Received disconnect from 45.55.249.98 port 59136:11: Bye Bye [preauth]
Nov 29 05:57:47 compute-1 sshd-session[30866]: Disconnected from invalid user user5 45.55.249.98 port 59136 [preauth]
Nov 29 05:58:16 compute-1 sshd-session[30869]: Received disconnect from 118.194.230.250 port 48424:11: Bye Bye [preauth]
Nov 29 05:58:16 compute-1 sshd-session[30869]: Disconnected from authenticating user daemon 118.194.230.250 port 48424 [preauth]
Nov 29 05:58:19 compute-1 sshd-session[29939]: Received disconnect from 38.102.83.107 port 35354:11: disconnected by user
Nov 29 05:58:19 compute-1 sshd-session[29939]: Disconnected from user zuul 38.102.83.107 port 35354
Nov 29 05:58:19 compute-1 sshd-session[29936]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:58:20 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Nov 29 05:58:20 compute-1 systemd[1]: session-8.scope: Consumed 5.952s CPU time.
Nov 29 05:58:20 compute-1 systemd-logind[785]: Session 8 logged out. Waiting for processes to exit.
Nov 29 05:58:20 compute-1 systemd-logind[785]: Removed session 8.
Nov 29 05:58:51 compute-1 sshd-session[30871]: Received disconnect from 45.55.249.98 port 36506:11: Bye Bye [preauth]
Nov 29 05:58:51 compute-1 sshd-session[30871]: Disconnected from authenticating user root 45.55.249.98 port 36506 [preauth]
Nov 29 05:59:31 compute-1 sshd-session[30873]: Invalid user deploy from 118.194.230.250 port 48522
Nov 29 05:59:31 compute-1 sshd-session[30873]: Received disconnect from 118.194.230.250 port 48522:11: Bye Bye [preauth]
Nov 29 05:59:31 compute-1 sshd-session[30873]: Disconnected from invalid user deploy 118.194.230.250 port 48522 [preauth]
Nov 29 05:59:33 compute-1 sshd-session[30875]: Invalid user qw from 71.70.164.48 port 58225
Nov 29 05:59:33 compute-1 sshd-session[30875]: Received disconnect from 71.70.164.48 port 58225:11: Bye Bye [preauth]
Nov 29 05:59:33 compute-1 sshd-session[30875]: Disconnected from invalid user qw 71.70.164.48 port 58225 [preauth]
Nov 29 05:59:49 compute-1 sshd-session[30877]: Invalid user ec2-user from 45.55.249.98 port 49530
Nov 29 05:59:49 compute-1 sshd-session[30877]: Received disconnect from 45.55.249.98 port 49530:11: Bye Bye [preauth]
Nov 29 05:59:49 compute-1 sshd-session[30877]: Disconnected from invalid user ec2-user 45.55.249.98 port 49530 [preauth]
Nov 29 06:00:42 compute-1 sshd-session[30879]: Invalid user hu from 118.194.230.250 port 48618
Nov 29 06:00:42 compute-1 sshd-session[30879]: Received disconnect from 118.194.230.250 port 48618:11: Bye Bye [preauth]
Nov 29 06:00:42 compute-1 sshd-session[30879]: Disconnected from invalid user hu 118.194.230.250 port 48618 [preauth]
Nov 29 06:00:57 compute-1 sshd-session[30881]: Received disconnect from 45.55.249.98 port 44278:11: Bye Bye [preauth]
Nov 29 06:00:57 compute-1 sshd-session[30881]: Disconnected from authenticating user root 45.55.249.98 port 44278 [preauth]
Nov 29 06:01:01 compute-1 CROND[30884]: (root) CMD (run-parts /etc/cron.hourly)
Nov 29 06:01:01 compute-1 run-parts[30887]: (/etc/cron.hourly) starting 0anacron
Nov 29 06:01:01 compute-1 anacron[30895]: Anacron started on 2025-11-29
Nov 29 06:01:01 compute-1 anacron[30895]: Will run job `cron.daily' in 24 min.
Nov 29 06:01:01 compute-1 anacron[30895]: Will run job `cron.weekly' in 44 min.
Nov 29 06:01:01 compute-1 anacron[30895]: Will run job `cron.monthly' in 64 min.
Nov 29 06:01:01 compute-1 anacron[30895]: Jobs will be executed sequentially
Nov 29 06:01:01 compute-1 run-parts[30897]: (/etc/cron.hourly) finished 0anacron
Nov 29 06:01:01 compute-1 CROND[30883]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 29 06:01:49 compute-1 sshd-session[30898]: Received disconnect from 71.70.164.48 port 58184:11: Bye Bye [preauth]
Nov 29 06:01:49 compute-1 sshd-session[30898]: Disconnected from authenticating user root 71.70.164.48 port 58184 [preauth]
Nov 29 06:01:54 compute-1 sshd-session[30900]: Invalid user kafka from 118.194.230.250 port 48718
Nov 29 06:01:54 compute-1 sshd-session[30900]: Received disconnect from 118.194.230.250 port 48718:11: Bye Bye [preauth]
Nov 29 06:01:54 compute-1 sshd-session[30900]: Disconnected from invalid user kafka 118.194.230.250 port 48718 [preauth]
Nov 29 06:01:57 compute-1 sshd-session[30902]: Invalid user tuan from 45.55.249.98 port 40382
Nov 29 06:01:57 compute-1 sshd-session[30902]: Received disconnect from 45.55.249.98 port 40382:11: Bye Bye [preauth]
Nov 29 06:01:57 compute-1 sshd-session[30902]: Disconnected from invalid user tuan 45.55.249.98 port 40382 [preauth]
Nov 29 06:03:04 compute-1 sshd-session[30905]: Invalid user erpnext from 45.55.249.98 port 52750
Nov 29 06:03:04 compute-1 sshd-session[30905]: Received disconnect from 45.55.249.98 port 52750:11: Bye Bye [preauth]
Nov 29 06:03:04 compute-1 sshd-session[30905]: Disconnected from invalid user erpnext 45.55.249.98 port 52750 [preauth]
Nov 29 06:03:11 compute-1 sshd-session[30907]: Received disconnect from 118.194.230.250 port 48820:11: Bye Bye [preauth]
Nov 29 06:03:11 compute-1 sshd-session[30907]: Disconnected from authenticating user root 118.194.230.250 port 48820 [preauth]
Nov 29 06:04:09 compute-1 sshd-session[30910]: Invalid user vagrant from 71.70.164.48 port 57604
Nov 29 06:04:09 compute-1 sshd-session[30910]: Received disconnect from 71.70.164.48 port 57604:11: Bye Bye [preauth]
Nov 29 06:04:09 compute-1 sshd-session[30910]: Disconnected from invalid user vagrant 71.70.164.48 port 57604 [preauth]
Nov 29 06:04:14 compute-1 sshd-session[30912]: Received disconnect from 45.55.249.98 port 53584:11: Bye Bye [preauth]
Nov 29 06:04:14 compute-1 sshd-session[30912]: Disconnected from authenticating user root 45.55.249.98 port 53584 [preauth]
Nov 29 06:04:27 compute-1 sshd-session[30915]: Invalid user centos from 118.194.230.250 port 48924
Nov 29 06:04:27 compute-1 sshd-session[30915]: Received disconnect from 118.194.230.250 port 48924:11: Bye Bye [preauth]
Nov 29 06:04:27 compute-1 sshd-session[30915]: Disconnected from invalid user centos 118.194.230.250 port 48924 [preauth]
Nov 29 06:05:29 compute-1 sshd-session[30918]: Received disconnect from 45.55.249.98 port 50234:11: Bye Bye [preauth]
Nov 29 06:05:29 compute-1 sshd-session[30918]: Disconnected from authenticating user root 45.55.249.98 port 50234 [preauth]
Nov 29 06:05:45 compute-1 sshd-session[30920]: Received disconnect from 118.194.230.250 port 49034:11: Bye Bye [preauth]
Nov 29 06:05:45 compute-1 sshd-session[30920]: Disconnected from authenticating user root 118.194.230.250 port 49034 [preauth]
Nov 29 06:06:21 compute-1 sshd-session[30923]: Invalid user khan from 71.70.164.48 port 56555
Nov 29 06:06:21 compute-1 sshd-session[30923]: Received disconnect from 71.70.164.48 port 56555:11: Bye Bye [preauth]
Nov 29 06:06:21 compute-1 sshd-session[30923]: Disconnected from invalid user khan 71.70.164.48 port 56555 [preauth]
Nov 29 06:06:27 compute-1 sshd-session[30925]: Accepted publickey for zuul from 192.168.122.30 port 40074 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:06:27 compute-1 systemd-logind[785]: New session 9 of user zuul.
Nov 29 06:06:27 compute-1 systemd[1]: Started Session 9 of User zuul.
Nov 29 06:06:27 compute-1 sshd-session[30925]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:06:28 compute-1 python3.9[31078]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:06:29 compute-1 sudo[31257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkjerbfosasabykhormarbrhefestsfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396388.8744407-62-130113378615928/AnsiballZ_command.py'
Nov 29 06:06:29 compute-1 sudo[31257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:06:29 compute-1 python3.9[31259]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:06:38 compute-1 sudo[31257]: pam_unix(sudo:session): session closed for user root
Nov 29 06:06:38 compute-1 sshd-session[30928]: Connection closed by 192.168.122.30 port 40074
Nov 29 06:06:38 compute-1 sshd-session[30925]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:06:38 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Nov 29 06:06:38 compute-1 systemd[1]: session-9.scope: Consumed 8.530s CPU time.
Nov 29 06:06:38 compute-1 systemd-logind[785]: Session 9 logged out. Waiting for processes to exit.
Nov 29 06:06:38 compute-1 systemd-logind[785]: Removed session 9.
Nov 29 06:06:40 compute-1 sshd-session[31318]: Invalid user user2 from 45.55.249.98 port 55558
Nov 29 06:06:40 compute-1 sshd-session[31318]: Received disconnect from 45.55.249.98 port 55558:11: Bye Bye [preauth]
Nov 29 06:06:40 compute-1 sshd-session[31318]: Disconnected from invalid user user2 45.55.249.98 port 55558 [preauth]
Nov 29 06:06:54 compute-1 sshd-session[31320]: Accepted publickey for zuul from 192.168.122.30 port 46280 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:06:54 compute-1 systemd-logind[785]: New session 10 of user zuul.
Nov 29 06:06:54 compute-1 systemd[1]: Started Session 10 of User zuul.
Nov 29 06:06:54 compute-1 sshd-session[31320]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:06:55 compute-1 python3.9[31473]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 29 06:06:56 compute-1 python3.9[31647]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:06:57 compute-1 sudo[31797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npuoaihppnlylnmrmgsizzvtpvipylpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396417.227247-99-229535229529709/AnsiballZ_command.py'
Nov 29 06:06:57 compute-1 sudo[31797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:06:57 compute-1 python3.9[31799]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:06:57 compute-1 sudo[31797]: pam_unix(sudo:session): session closed for user root
Nov 29 06:06:59 compute-1 sudo[31950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuchejetxebjutffptozecflsquyperj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396418.4801176-135-165012811407127/AnsiballZ_stat.py'
Nov 29 06:06:59 compute-1 sudo[31950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:06:59 compute-1 python3.9[31952]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:06:59 compute-1 sudo[31950]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:00 compute-1 sudo[32102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kupscmwojwoyosipevdezvemqmoharhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396419.4873955-159-268364947635422/AnsiballZ_file.py'
Nov 29 06:07:00 compute-1 sudo[32102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:00 compute-1 python3.9[32104]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:07:00 compute-1 sudo[32102]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:01 compute-1 sudo[32254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vclwgjcsshasalpjfsynooibiieodaxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396420.8791122-183-233899878958317/AnsiballZ_stat.py'
Nov 29 06:07:01 compute-1 sudo[32254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:01 compute-1 python3.9[32256]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:07:01 compute-1 sudo[32254]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:02 compute-1 sudo[32379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkjkbbirjbfzhellsbudqpvwqayskgti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396420.8791122-183-233899878958317/AnsiballZ_copy.py'
Nov 29 06:07:02 compute-1 sudo[32379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:02 compute-1 python3.9[32381]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396420.8791122-183-233899878958317/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:07:02 compute-1 sudo[32379]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:02 compute-1 sshd-session[32257]: Received disconnect from 118.194.230.250 port 49138:11: Bye Bye [preauth]
Nov 29 06:07:02 compute-1 sshd-session[32257]: Disconnected from authenticating user root 118.194.230.250 port 49138 [preauth]
Nov 29 06:07:02 compute-1 irqbalance[781]: Cannot change IRQ 33 affinity: Operation not permitted
Nov 29 06:07:02 compute-1 irqbalance[781]: IRQ 33 affinity is now unmanaged
Nov 29 06:07:02 compute-1 sudo[32531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjvrpttwyannceuhhmmeyzpnnepdwsww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396422.4550283-228-2557415634986/AnsiballZ_setup.py'
Nov 29 06:07:02 compute-1 sudo[32531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:03 compute-1 python3.9[32533]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:07:03 compute-1 sudo[32531]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:03 compute-1 sudo[32687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avmcezhzeycwcqgfbgmzxcvfpjmrebzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396423.5971158-252-25305553963083/AnsiballZ_file.py'
Nov 29 06:07:03 compute-1 sudo[32687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:04 compute-1 python3.9[32689]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:07:04 compute-1 sudo[32687]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:04 compute-1 sudo[32839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukpzbdfcbdijufunziuwgswqqvawlzqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396424.5417392-279-196417078167559/AnsiballZ_file.py'
Nov 29 06:07:04 compute-1 sudo[32839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:05 compute-1 python3.9[32841]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:07:05 compute-1 sudo[32839]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:06 compute-1 python3.9[32991]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:07:10 compute-1 python3.9[33244]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:07:10 compute-1 python3.9[33394]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:07:12 compute-1 python3.9[33548]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:07:13 compute-1 sudo[33704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smnyprymmzgulwirjuiwzwcrngkjxxzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396432.79536-423-113767697903321/AnsiballZ_setup.py'
Nov 29 06:07:13 compute-1 sudo[33704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:13 compute-1 python3.9[33706]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:07:13 compute-1 sudo[33704]: pam_unix(sudo:session): session closed for user root
Nov 29 06:07:14 compute-1 sudo[33788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqywtwrdcpxreyxofrfmugqtyooydars ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396432.79536-423-113767697903321/AnsiballZ_dnf.py'
Nov 29 06:07:14 compute-1 sudo[33788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:07:14 compute-1 python3.9[33790]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:07:56 compute-1 sshd-session[33935]: Received disconnect from 45.55.249.98 port 59140:11: Bye Bye [preauth]
Nov 29 06:07:56 compute-1 sshd-session[33935]: Disconnected from authenticating user root 45.55.249.98 port 59140 [preauth]
Nov 29 06:08:00 compute-1 systemd[1]: Reloading.
Nov 29 06:08:00 compute-1 systemd-rc-local-generator[33992]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:08:00 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 29 06:08:00 compute-1 systemd[1]: Reloading.
Nov 29 06:08:00 compute-1 systemd-rc-local-generator[34029]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:08:00 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 29 06:08:00 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 29 06:08:01 compute-1 systemd[1]: Reloading.
Nov 29 06:08:01 compute-1 systemd-rc-local-generator[34074]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:08:01 compute-1 systemd[1]: Starting dnf makecache...
Nov 29 06:08:01 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 29 06:08:01 compute-1 dnf[34081]: Failed determining last makecache time.
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-openstack-barbican-42b4c41831408a8e323 150 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 184 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-openstack-cinder-1c00d6490d88e436f26ef 167 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-python-stevedore-c4acc5639fd2329372142 187 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-python-cloudkitty-tests-tempest-2c80f8 207 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-os-net-config-9758ab42364673d01bc5014e 204 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 194 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-python-designate-tests-tempest-347fdbc 182 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-openstack-glance-1fd12c29b339f30fe823e 161 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 179 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-openstack-manila-3c01b7181572c95dac462 196 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-python-whitebox-neutron-tests-tempest- 206 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-openstack-octavia-ba397f07a7331190208c 187 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-openstack-watcher-c014f81a8647287f6dcc 173 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-python-tcib-1124124ec06aadbac34f0d340b 195 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 176 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-openstack-swift-dc98a8463506ac520c469a 187 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-python-tempestconf-8515371b7cceebd4282 182 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dnf[34081]: delorean-openstack-heat-ui-013accbfd179753bc3f0 169 kB/s | 3.0 kB     00:00
Nov 29 06:08:01 compute-1 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 06:08:01 compute-1 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 06:08:01 compute-1 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 06:08:01 compute-1 dnf[34081]: CentOS Stream 9 - BaseOS                         49 kB/s | 7.3 kB     00:00
Nov 29 06:08:02 compute-1 dnf[34081]: CentOS Stream 9 - AppStream                      48 kB/s | 7.4 kB     00:00
Nov 29 06:08:02 compute-1 dnf[34081]: CentOS Stream 9 - CRB                            73 kB/s | 7.2 kB     00:00
Nov 29 06:08:02 compute-1 dnf[34081]: CentOS Stream 9 - Extras packages                29 kB/s | 8.3 kB     00:00
Nov 29 06:08:02 compute-1 dnf[34081]: dlrn-antelope-testing                           141 kB/s | 3.0 kB     00:00
Nov 29 06:08:02 compute-1 dnf[34081]: dlrn-antelope-build-deps                        175 kB/s | 3.0 kB     00:00
Nov 29 06:08:02 compute-1 dnf[34081]: centos9-rabbitmq                                133 kB/s | 3.0 kB     00:00
Nov 29 06:08:02 compute-1 dnf[34081]: centos9-storage                                 127 kB/s | 3.0 kB     00:00
Nov 29 06:08:02 compute-1 dnf[34081]: centos9-opstools                                129 kB/s | 3.0 kB     00:00
Nov 29 06:08:02 compute-1 dnf[34081]: NFV SIG OpenvSwitch                             131 kB/s | 3.0 kB     00:00
Nov 29 06:08:02 compute-1 dnf[34081]: repo-setup-centos-appstream                     183 kB/s | 4.4 kB     00:00
Nov 29 06:08:03 compute-1 dnf[34081]: repo-setup-centos-baseos                        151 kB/s | 3.9 kB     00:00
Nov 29 06:08:03 compute-1 dnf[34081]: repo-setup-centos-highavailability              168 kB/s | 3.9 kB     00:00
Nov 29 06:08:03 compute-1 dnf[34081]: repo-setup-centos-powertools                    175 kB/s | 4.3 kB     00:00
Nov 29 06:08:03 compute-1 dnf[34081]: Extra Packages for Enterprise Linux 9 - x86_64   88 kB/s |  33 kB     00:00
Nov 29 06:08:04 compute-1 dnf[34081]: Metadata cache created.
Nov 29 06:08:04 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 29 06:08:04 compute-1 systemd[1]: Finished dnf makecache.
Nov 29 06:08:04 compute-1 systemd[1]: dnf-makecache.service: Consumed 1.818s CPU time.
Nov 29 06:08:18 compute-1 sshd-session[34181]: Invalid user user1 from 118.194.230.250 port 49234
Nov 29 06:08:19 compute-1 sshd-session[34181]: Received disconnect from 118.194.230.250 port 49234:11: Bye Bye [preauth]
Nov 29 06:08:19 compute-1 sshd-session[34181]: Disconnected from invalid user user1 118.194.230.250 port 49234 [preauth]
Nov 29 06:08:32 compute-1 sshd-session[34236]: Invalid user ubadmin from 71.70.164.48 port 55008
Nov 29 06:08:32 compute-1 sshd-session[34236]: Received disconnect from 71.70.164.48 port 55008:11: Bye Bye [preauth]
Nov 29 06:08:32 compute-1 sshd-session[34236]: Disconnected from invalid user ubadmin 71.70.164.48 port 55008 [preauth]
Nov 29 06:09:13 compute-1 kernel: SELinux:  Converting 2718 SID table entries...
Nov 29 06:09:13 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:09:13 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:09:13 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:09:13 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:09:13 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:09:13 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:09:13 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:09:14 compute-1 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 29 06:09:14 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:09:14 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:09:14 compute-1 systemd[1]: Reloading.
Nov 29 06:09:14 compute-1 systemd-rc-local-generator[34447]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:09:14 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:09:15 compute-1 sudo[33788]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:16 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:09:16 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:09:16 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.572s CPU time.
Nov 29 06:09:16 compute-1 systemd[1]: run-r98930854f7fc4a1cabb4738f9b03ebf0.service: Deactivated successfully.
Nov 29 06:09:18 compute-1 sudo[35359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gatfsivsebrgofpefnpbnlggennwzdte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396556.1890347-460-238250690320854/AnsiballZ_command.py'
Nov 29 06:09:18 compute-1 sudo[35359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:18 compute-1 python3.9[35361]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:09:19 compute-1 sudo[35359]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:19 compute-1 sshd-session[35509]: Invalid user user10 from 45.55.249.98 port 53618
Nov 29 06:09:19 compute-1 sshd-session[35509]: Received disconnect from 45.55.249.98 port 53618:11: Bye Bye [preauth]
Nov 29 06:09:19 compute-1 sshd-session[35509]: Disconnected from invalid user user10 45.55.249.98 port 53618 [preauth]
Nov 29 06:09:20 compute-1 sudo[35642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpubiukrusdakumlnopgpumhqngsqkze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396559.9959495-484-258312160105501/AnsiballZ_selinux.py'
Nov 29 06:09:20 compute-1 sudo[35642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:21 compute-1 python3.9[35644]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 29 06:09:21 compute-1 sudo[35642]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:22 compute-1 sudo[35794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvzkojkwgxygafhgwwsmqudeynpjkqen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396562.2740562-516-111361584728707/AnsiballZ_command.py'
Nov 29 06:09:22 compute-1 sudo[35794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:22 compute-1 python3.9[35796]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 29 06:09:24 compute-1 sudo[35794]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:25 compute-1 sudo[35948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kltqeqgitgtzxxeobnetsdtwffulrulr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396565.1304715-540-270206793572964/AnsiballZ_file.py'
Nov 29 06:09:25 compute-1 sudo[35948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:26 compute-1 python3.9[35950]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:09:26 compute-1 sudo[35948]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:26 compute-1 sudo[36100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkpwdwnroaecrsvldwmicqemqjqwbvfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396566.3488126-564-137832904273358/AnsiballZ_mount.py'
Nov 29 06:09:26 compute-1 sudo[36100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:27 compute-1 python3.9[36102]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 29 06:09:27 compute-1 sudo[36100]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:29 compute-1 sudo[36252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcrxyfogsrxbvwosvjeemxlarjyacgre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396569.4514742-648-95588316424573/AnsiballZ_file.py'
Nov 29 06:09:29 compute-1 sudo[36252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:30 compute-1 python3.9[36254]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:09:30 compute-1 sudo[36252]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:32 compute-1 sudo[36404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cktbpvqrcmvjjaxrkzivmjbemyunskaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396572.4090178-672-115381945327346/AnsiballZ_stat.py'
Nov 29 06:09:32 compute-1 sudo[36404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:33 compute-1 python3.9[36406]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:09:33 compute-1 sudo[36404]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:33 compute-1 sudo[36527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzumgxrmvxyuhidiwdchgaoxyktatcir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396572.4090178-672-115381945327346/AnsiballZ_copy.py'
Nov 29 06:09:33 compute-1 sudo[36527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:37 compute-1 python3.9[36529]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396572.4090178-672-115381945327346/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:09:37 compute-1 sudo[36527]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:38 compute-1 sudo[36681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zajsrbdneeorgblrstpjyukqmsneaqmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396578.3219435-744-250195863097878/AnsiballZ_stat.py'
Nov 29 06:09:38 compute-1 sudo[36681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:38 compute-1 python3.9[36683]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:09:38 compute-1 sudo[36681]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:39 compute-1 sudo[36833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eilvjzgddhmunufilfcyaomisscclhba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396579.1085024-768-235253239316480/AnsiballZ_command.py'
Nov 29 06:09:39 compute-1 sudo[36833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:39 compute-1 sshd-session[36611]: Received disconnect from 118.194.230.250 port 49342:11: Bye Bye [preauth]
Nov 29 06:09:39 compute-1 sshd-session[36611]: Disconnected from authenticating user root 118.194.230.250 port 49342 [preauth]
Nov 29 06:09:39 compute-1 python3.9[36835]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:09:39 compute-1 sudo[36833]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:40 compute-1 sudo[36986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzsxztppjlbmurtdvmfidudfzfocivaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396580.097669-792-207136546795241/AnsiballZ_file.py'
Nov 29 06:09:40 compute-1 sudo[36986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:40 compute-1 python3.9[36988]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:09:40 compute-1 sudo[36986]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:41 compute-1 sudo[37138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucyenfujsdrzjbderqsrjrazdyoffgkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396581.2039952-825-43297752936123/AnsiballZ_getent.py'
Nov 29 06:09:41 compute-1 sudo[37138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:41 compute-1 python3.9[37140]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 29 06:09:42 compute-1 sudo[37138]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:42 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:09:42 compute-1 sudo[37292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqkdxcrojcbqmwboiqtgvvipahflblbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396582.263409-849-134005774152158/AnsiballZ_group.py'
Nov 29 06:09:42 compute-1 sudo[37292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:43 compute-1 python3.9[37294]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:09:43 compute-1 groupadd[37295]: group added to /etc/group: name=qemu, GID=107
Nov 29 06:09:43 compute-1 groupadd[37295]: group added to /etc/gshadow: name=qemu
Nov 29 06:09:43 compute-1 groupadd[37295]: new group: name=qemu, GID=107
Nov 29 06:09:43 compute-1 sudo[37292]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:44 compute-1 sudo[37450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cayiionejygdwlzbehxlkxnltwlbjfim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396583.4945774-873-268825960803871/AnsiballZ_user.py'
Nov 29 06:09:44 compute-1 sudo[37450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:44 compute-1 python3.9[37452]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:09:44 compute-1 useradd[37454]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Nov 29 06:09:44 compute-1 sudo[37450]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:45 compute-1 sudo[37610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmvwsnjlppieqljzygisitlizmloxjsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396584.974722-897-125665547593626/AnsiballZ_getent.py'
Nov 29 06:09:45 compute-1 sudo[37610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:45 compute-1 python3.9[37612]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 29 06:09:45 compute-1 sudo[37610]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:46 compute-1 sudo[37763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyqroeqovdahlxjllwjlltjbblpuyoqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396585.817354-921-12106150347372/AnsiballZ_group.py'
Nov 29 06:09:46 compute-1 sudo[37763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:46 compute-1 python3.9[37765]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:09:46 compute-1 groupadd[37766]: group added to /etc/group: name=hugetlbfs, GID=42477
Nov 29 06:09:46 compute-1 groupadd[37766]: group added to /etc/gshadow: name=hugetlbfs
Nov 29 06:09:46 compute-1 groupadd[37766]: new group: name=hugetlbfs, GID=42477
Nov 29 06:09:46 compute-1 sudo[37763]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:47 compute-1 sudo[37921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtqfwgahypieiztiihrkmykcfzaxuozg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396586.763854-948-276210540853809/AnsiballZ_file.py'
Nov 29 06:09:47 compute-1 sudo[37921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:47 compute-1 python3.9[37923]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 29 06:09:47 compute-1 sudo[37921]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:48 compute-1 sudo[38073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icqvrkzuezdhbaazomluvgqxygggvhca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396587.8487937-981-91650843043720/AnsiballZ_dnf.py'
Nov 29 06:09:48 compute-1 sudo[38073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:48 compute-1 python3.9[38075]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:09:50 compute-1 sudo[38073]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:51 compute-1 sudo[38226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwxmbvmardknnkxeaypwmlinotoidbfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396590.651534-1005-143498916687333/AnsiballZ_file.py'
Nov 29 06:09:51 compute-1 sudo[38226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:51 compute-1 python3.9[38228]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:09:51 compute-1 sudo[38226]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:51 compute-1 sudo[38378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqhximtuwkcshnwhljyxalmukbxddstx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396591.5119562-1029-78695063460409/AnsiballZ_stat.py'
Nov 29 06:09:51 compute-1 sudo[38378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:52 compute-1 python3.9[38380]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:09:52 compute-1 sudo[38378]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:52 compute-1 sudo[38501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmfxpjqwqkdvvlcvynwxslkjddupeeam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396591.5119562-1029-78695063460409/AnsiballZ_copy.py'
Nov 29 06:09:52 compute-1 sudo[38501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:52 compute-1 python3.9[38503]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396591.5119562-1029-78695063460409/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:09:52 compute-1 sudo[38501]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:53 compute-1 sudo[38653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thkyuefextxaovfehpujvlomlccrhpfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396593.0746758-1074-207703326422991/AnsiballZ_systemd.py'
Nov 29 06:09:53 compute-1 sudo[38653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:53 compute-1 python3.9[38655]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:09:54 compute-1 systemd[1]: Starting Load Kernel Modules...
Nov 29 06:09:54 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 29 06:09:54 compute-1 kernel: Bridge firewalling registered
Nov 29 06:09:54 compute-1 systemd-modules-load[38659]: Inserted module 'br_netfilter'
Nov 29 06:09:54 compute-1 systemd[1]: Finished Load Kernel Modules.
Nov 29 06:09:54 compute-1 sudo[38653]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:54 compute-1 sudo[38812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvpnhwxmnfodewncsirltpmzqbbfylkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396594.6318362-1098-171061386826049/AnsiballZ_stat.py'
Nov 29 06:09:54 compute-1 sudo[38812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:55 compute-1 python3.9[38814]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:09:55 compute-1 sudo[38812]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:55 compute-1 sudo[38935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nghjcexbyrzhaadchdvadoibjbrvqled ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396594.6318362-1098-171061386826049/AnsiballZ_copy.py'
Nov 29 06:09:55 compute-1 sudo[38935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:55 compute-1 python3.9[38937]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396594.6318362-1098-171061386826049/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:09:55 compute-1 sudo[38935]: pam_unix(sudo:session): session closed for user root
Nov 29 06:09:56 compute-1 sudo[39087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tittlkucdontfspvbvoleyglzbfpjahm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396596.3976972-1152-107354525734941/AnsiballZ_dnf.py'
Nov 29 06:09:56 compute-1 sudo[39087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:09:57 compute-1 python3.9[39089]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:10:02 compute-1 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 06:10:02 compute-1 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 06:10:02 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:10:02 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:10:02 compute-1 systemd[1]: Reloading.
Nov 29 06:10:03 compute-1 systemd-rc-local-generator[39152]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:10:03 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:10:03 compute-1 sudo[39087]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:04 compute-1 python3.9[40574]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:10:05 compute-1 python3.9[41503]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 29 06:10:06 compute-1 python3.9[42193]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:10:07 compute-1 sudo[43058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkcynvntqvkysvzzgkicuwduvmzrylet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396606.8601422-1269-116616791476895/AnsiballZ_command.py'
Nov 29 06:10:07 compute-1 sudo[43058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:07 compute-1 python3.9[43072]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:10:07 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 06:10:07 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:10:07 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:10:07 compute-1 systemd[1]: man-db-cache-update.service: Consumed 5.882s CPU time.
Nov 29 06:10:07 compute-1 systemd[1]: run-rb9bf4a3f2bde443fa32c2d2735b599e3.service: Deactivated successfully.
Nov 29 06:10:08 compute-1 systemd[1]: Starting Authorization Manager...
Nov 29 06:10:08 compute-1 polkitd[43499]: Started polkitd version 0.117
Nov 29 06:10:08 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 06:10:08 compute-1 polkitd[43499]: Loading rules from directory /etc/polkit-1/rules.d
Nov 29 06:10:08 compute-1 polkitd[43499]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 29 06:10:08 compute-1 polkitd[43499]: Finished loading, compiling and executing 2 rules
Nov 29 06:10:08 compute-1 polkitd[43499]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 29 06:10:08 compute-1 systemd[1]: Started Authorization Manager.
Nov 29 06:10:08 compute-1 sudo[43058]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:08 compute-1 sudo[43667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frisvxinzuuvukfbmdtonyceedykjsyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396608.6207705-1296-164479972313546/AnsiballZ_systemd.py'
Nov 29 06:10:08 compute-1 sudo[43667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:09 compute-1 python3.9[43669]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:10:09 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 29 06:10:09 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Nov 29 06:10:09 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 29 06:10:09 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 06:10:09 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 06:10:09 compute-1 sudo[43667]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:10 compute-1 python3.9[43831]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 29 06:10:14 compute-1 sudo[43981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqcwwhvfzrtxecmbyqvrkdxlevwppoxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396614.0105352-1467-119287722280850/AnsiballZ_systemd.py'
Nov 29 06:10:14 compute-1 sudo[43981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:14 compute-1 python3.9[43983]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:10:14 compute-1 systemd[1]: Reloading.
Nov 29 06:10:14 compute-1 systemd-rc-local-generator[44010]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:10:14 compute-1 sudo[43981]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:15 compute-1 sudo[44171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhdtmxhshezblzehqwvkmjurkrgioeiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396615.0642643-1467-59831230593671/AnsiballZ_systemd.py'
Nov 29 06:10:15 compute-1 sudo[44171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:15 compute-1 python3.9[44173]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:10:15 compute-1 systemd[1]: Reloading.
Nov 29 06:10:15 compute-1 systemd-rc-local-generator[44198]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:10:15 compute-1 sudo[44171]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:16 compute-1 sudo[44360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrivzzurvggpecljeusiaclzucvxpspp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396616.513832-1515-40899123484958/AnsiballZ_command.py'
Nov 29 06:10:16 compute-1 sudo[44360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:17 compute-1 python3.9[44362]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:10:17 compute-1 sudo[44360]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:17 compute-1 sudo[44513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfnmmqprnjcviysqvqwxrcjbtxhqjgez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396617.3724415-1539-20822497821339/AnsiballZ_command.py'
Nov 29 06:10:17 compute-1 sudo[44513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:17 compute-1 python3.9[44515]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:10:17 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 29 06:10:17 compute-1 sudo[44513]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:18 compute-1 sudo[44666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbkkybozkrfmnigyhqsmwlhhohdgjvvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396618.2517033-1563-3092512705289/AnsiballZ_command.py'
Nov 29 06:10:18 compute-1 sudo[44666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:18 compute-1 python3.9[44668]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:10:20 compute-1 sudo[44666]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:20 compute-1 sudo[44828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhvdjfmykqaggulrywfyngmeszvzxzwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396620.5655773-1587-206739061095666/AnsiballZ_command.py'
Nov 29 06:10:20 compute-1 sudo[44828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:21 compute-1 python3.9[44830]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:10:21 compute-1 sudo[44828]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:21 compute-1 sudo[44981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtzzxhgfkhzodcuixzizzeelwagbvzcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396621.523726-1611-130195748148572/AnsiballZ_systemd.py'
Nov 29 06:10:21 compute-1 sudo[44981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:22 compute-1 python3.9[44983]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:10:22 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 06:10:22 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Nov 29 06:10:22 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Nov 29 06:10:22 compute-1 systemd[1]: Starting Apply Kernel Variables...
Nov 29 06:10:22 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 06:10:22 compute-1 systemd[1]: Finished Apply Kernel Variables.
Nov 29 06:10:22 compute-1 sudo[44981]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:22 compute-1 sshd-session[31323]: Connection closed by 192.168.122.30 port 46280
Nov 29 06:10:22 compute-1 sshd-session[31320]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:10:22 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Nov 29 06:10:22 compute-1 systemd[1]: session-10.scope: Consumed 2min 28.892s CPU time.
Nov 29 06:10:22 compute-1 systemd-logind[785]: Session 10 logged out. Waiting for processes to exit.
Nov 29 06:10:22 compute-1 systemd-logind[785]: Removed session 10.
Nov 29 06:10:28 compute-1 sshd-session[45013]: Accepted publickey for zuul from 192.168.122.30 port 52326 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:10:28 compute-1 systemd-logind[785]: New session 11 of user zuul.
Nov 29 06:10:29 compute-1 systemd[1]: Started Session 11 of User zuul.
Nov 29 06:10:29 compute-1 sshd-session[45013]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:10:30 compute-1 python3.9[45166]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:10:31 compute-1 sudo[45320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnrdekcrnneewqidftmthzoqkkwfrgaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396631.264544-74-6394385188798/AnsiballZ_getent.py'
Nov 29 06:10:31 compute-1 sudo[45320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:31 compute-1 python3.9[45322]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 29 06:10:31 compute-1 sudo[45320]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:32 compute-1 sudo[45473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhbfqyyolqoimgosaixltvxfklpzhojp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396632.2197583-98-160199482449322/AnsiballZ_group.py'
Nov 29 06:10:32 compute-1 sudo[45473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:32 compute-1 python3.9[45475]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:10:33 compute-1 groupadd[45476]: group added to /etc/group: name=openvswitch, GID=42476
Nov 29 06:10:33 compute-1 groupadd[45476]: group added to /etc/gshadow: name=openvswitch
Nov 29 06:10:33 compute-1 groupadd[45476]: new group: name=openvswitch, GID=42476
Nov 29 06:10:33 compute-1 sudo[45473]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:34 compute-1 sudo[45631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aynwpurytzlubqrpatiysqglqltunzvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396633.6135347-122-255985988369797/AnsiballZ_user.py'
Nov 29 06:10:34 compute-1 sudo[45631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:34 compute-1 python3.9[45633]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:10:34 compute-1 useradd[45635]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Nov 29 06:10:34 compute-1 useradd[45635]: add 'openvswitch' to group 'hugetlbfs'
Nov 29 06:10:34 compute-1 useradd[45635]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 29 06:10:34 compute-1 sudo[45631]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:35 compute-1 sudo[45791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-binmshlccuoatusztnfsonqnprmgxdao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396635.0506856-152-111275928478974/AnsiballZ_setup.py'
Nov 29 06:10:35 compute-1 sudo[45791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:35 compute-1 python3.9[45793]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:10:35 compute-1 sudo[45791]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:36 compute-1 sudo[45875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkbsctnlizodcfwbizoxqymtpwzsmkqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396635.0506856-152-111275928478974/AnsiballZ_dnf.py'
Nov 29 06:10:36 compute-1 sudo[45875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:36 compute-1 python3.9[45877]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:10:38 compute-1 sshd-session[45879]: Invalid user controlm from 45.55.249.98 port 51506
Nov 29 06:10:38 compute-1 sshd-session[45879]: Received disconnect from 45.55.249.98 port 51506:11: Bye Bye [preauth]
Nov 29 06:10:38 compute-1 sshd-session[45879]: Disconnected from invalid user controlm 45.55.249.98 port 51506 [preauth]
Nov 29 06:10:39 compute-1 sudo[45875]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:40 compute-1 sudo[46041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yijnlxmbeqowegpakwsstmrcfgnuyfba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396640.3422585-194-212536405619551/AnsiballZ_dnf.py'
Nov 29 06:10:40 compute-1 sudo[46041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:40 compute-1 python3.9[46043]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:10:50 compute-1 sshd-session[46058]: Invalid user monitoring from 71.70.164.48 port 53543
Nov 29 06:10:50 compute-1 sshd-session[46058]: Received disconnect from 71.70.164.48 port 53543:11: Bye Bye [preauth]
Nov 29 06:10:50 compute-1 sshd-session[46058]: Disconnected from invalid user monitoring 71.70.164.48 port 53543 [preauth]
Nov 29 06:10:55 compute-1 kernel: SELinux:  Converting 2730 SID table entries...
Nov 29 06:10:55 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:10:55 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:10:55 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:10:55 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:10:55 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:10:55 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:10:55 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:10:55 compute-1 groupadd[46070]: group added to /etc/group: name=unbound, GID=993
Nov 29 06:10:55 compute-1 groupadd[46070]: group added to /etc/gshadow: name=unbound
Nov 29 06:10:55 compute-1 groupadd[46070]: new group: name=unbound, GID=993
Nov 29 06:10:55 compute-1 useradd[46077]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Nov 29 06:10:55 compute-1 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 29 06:10:55 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 29 06:10:55 compute-1 sshd-session[46064]: Received disconnect from 118.194.230.250 port 49444:11: Bye Bye [preauth]
Nov 29 06:10:55 compute-1 sshd-session[46064]: Disconnected from authenticating user root 118.194.230.250 port 49444 [preauth]
Nov 29 06:10:57 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:10:57 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:10:57 compute-1 systemd[1]: Reloading.
Nov 29 06:10:57 compute-1 systemd-rc-local-generator[46572]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:10:57 compute-1 systemd-sysv-generator[46576]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:10:57 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:10:58 compute-1 sudo[46041]: pam_unix(sudo:session): session closed for user root
Nov 29 06:10:58 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:10:58 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:10:58 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.024s CPU time.
Nov 29 06:10:58 compute-1 systemd[1]: run-r90c2b8d4a6594a5f9b46a5416e65883c.service: Deactivated successfully.
Nov 29 06:10:58 compute-1 sudo[47143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpgdpblykbozjrlnmkyaynjqnlvrxwak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396658.306328-218-15990544443077/AnsiballZ_systemd.py'
Nov 29 06:10:58 compute-1 sudo[47143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:10:59 compute-1 python3.9[47145]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:10:59 compute-1 systemd[1]: Reloading.
Nov 29 06:10:59 compute-1 systemd-rc-local-generator[47176]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:10:59 compute-1 systemd-sysv-generator[47180]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:10:59 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Nov 29 06:10:59 compute-1 chown[47187]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 29 06:10:59 compute-1 ovs-ctl[47192]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 29 06:10:59 compute-1 ovs-ctl[47192]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 29 06:10:59 compute-1 ovs-ctl[47192]: Starting ovsdb-server [  OK  ]
Nov 29 06:10:59 compute-1 ovs-vsctl[47241]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 29 06:11:00 compute-1 ovs-vsctl[47257]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2fa83236-07b6-4ff7-bb56-9f4f13bed719\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 29 06:11:00 compute-1 ovs-ctl[47192]: Configuring Open vSwitch system IDs [  OK  ]
Nov 29 06:11:00 compute-1 ovs-ctl[47192]: Enabling remote OVSDB managers [  OK  ]
Nov 29 06:11:00 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Nov 29 06:11:00 compute-1 ovs-vsctl[47266]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 29 06:11:00 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 29 06:11:00 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 29 06:11:00 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 29 06:11:00 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Nov 29 06:11:00 compute-1 ovs-ctl[47310]: Inserting openvswitch module [  OK  ]
Nov 29 06:11:00 compute-1 ovs-ctl[47279]: Starting ovs-vswitchd [  OK  ]
Nov 29 06:11:00 compute-1 ovs-vsctl[47327]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 29 06:11:00 compute-1 ovs-ctl[47279]: Enabling remote OVSDB managers [  OK  ]
Nov 29 06:11:00 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 29 06:11:00 compute-1 systemd[1]: Starting Open vSwitch...
Nov 29 06:11:00 compute-1 systemd[1]: Finished Open vSwitch.
Nov 29 06:11:00 compute-1 sudo[47143]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:01 compute-1 python3.9[47479]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:11:02 compute-1 sudo[47629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmxufzlahlvniyfjuifkopxbhozhfvwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396661.8209004-272-26751099052872/AnsiballZ_sefcontext.py'
Nov 29 06:11:02 compute-1 sudo[47629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:02 compute-1 python3.9[47631]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 29 06:11:03 compute-1 kernel: SELinux:  Converting 2744 SID table entries...
Nov 29 06:11:03 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:11:03 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:11:03 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:11:03 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:11:03 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:11:03 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:11:03 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:11:04 compute-1 sudo[47629]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:05 compute-1 python3.9[47786]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:11:05 compute-1 sudo[47942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijvxxejjsjfhuhoyqhjpeikolklpdvih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396665.5972078-326-215102929072027/AnsiballZ_dnf.py'
Nov 29 06:11:05 compute-1 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 29 06:11:05 compute-1 sudo[47942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:06 compute-1 python3.9[47944]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:11:07 compute-1 sudo[47942]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:08 compute-1 sudo[48095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wacbatqnifjbomkdubvfowltvpdpmaal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396667.8198042-350-74762782931196/AnsiballZ_command.py'
Nov 29 06:11:08 compute-1 sudo[48095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:08 compute-1 python3.9[48097]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:11:09 compute-1 sudo[48095]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:10 compute-1 sudo[48382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvezzcjfehoyptxiwpddstljxuasnagd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396669.5995731-374-80878535740725/AnsiballZ_file.py'
Nov 29 06:11:10 compute-1 sudo[48382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:10 compute-1 python3.9[48384]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 06:11:10 compute-1 sudo[48382]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:11 compute-1 python3.9[48534]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:11:11 compute-1 sudo[48686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wslouoloenvrhwvorstxxqvjufoxurby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396671.585963-422-274869281715776/AnsiballZ_dnf.py'
Nov 29 06:11:11 compute-1 sudo[48686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:12 compute-1 python3.9[48688]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:11:13 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:11:13 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:11:13 compute-1 systemd[1]: Reloading.
Nov 29 06:11:14 compute-1 systemd-sysv-generator[48729]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:11:14 compute-1 systemd-rc-local-generator[48726]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:11:14 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:11:14 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:11:14 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:11:14 compute-1 systemd[1]: run-r1d7b09ca33f74723a8f1f66caec391a5.service: Deactivated successfully.
Nov 29 06:11:14 compute-1 sudo[48686]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:15 compute-1 sudo[49004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsgnvsalxfhgefewgxfvckjvwwkyzfwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396675.0968442-446-127322880724377/AnsiballZ_systemd.py'
Nov 29 06:11:15 compute-1 sudo[49004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:15 compute-1 python3.9[49006]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:11:15 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 06:11:15 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Nov 29 06:11:15 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Nov 29 06:11:15 compute-1 systemd[1]: Stopping Network Manager...
Nov 29 06:11:15 compute-1 NetworkManager[7192]: <info>  [1764396675.7907] caught SIGTERM, shutting down normally.
Nov 29 06:11:15 compute-1 NetworkManager[7192]: <info>  [1764396675.7928] dhcp4 (eth0): canceled DHCP transaction
Nov 29 06:11:15 compute-1 NetworkManager[7192]: <info>  [1764396675.7929] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 06:11:15 compute-1 NetworkManager[7192]: <info>  [1764396675.7929] dhcp4 (eth0): state changed no lease
Nov 29 06:11:15 compute-1 NetworkManager[7192]: <info>  [1764396675.7931] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 06:11:15 compute-1 NetworkManager[7192]: <info>  [1764396675.8010] exiting (success)
Nov 29 06:11:15 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 06:11:15 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 06:11:15 compute-1 systemd[1]: Stopped Network Manager.
Nov 29 06:11:15 compute-1 systemd[1]: NetworkManager.service: Consumed 14.026s CPU time, 4.1M memory peak, read 0B from disk, written 37.5K to disk.
Nov 29 06:11:15 compute-1 systemd[1]: Starting Network Manager...
Nov 29 06:11:15 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.8686] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:ba0dbca2-f496-4536-953e-379bfe5fc9e9)
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.8687] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.8763] manager[0x56216546a090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 06:11:15 compute-1 systemd[1]: Starting Hostname Service...
Nov 29 06:11:15 compute-1 systemd[1]: Started Hostname Service.
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9769] hostname: hostname: using hostnamed
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9771] hostname: static hostname changed from (none) to "compute-1"
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9779] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9785] manager[0x56216546a090]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9786] manager[0x56216546a090]: rfkill: WWAN hardware radio set enabled
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9824] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9840] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9841] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9842] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9843] manager: Networking is enabled by state file
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9846] settings: Loaded settings plugin: keyfile (internal)
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9852] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9907] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9923] dhcp: init: Using DHCP client 'internal'
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9928] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9939] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9948] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9961] device (lo): Activation: starting connection 'lo' (e88f289b-57af-451c-b662-f7b3e0248e91)
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9972] device (eth0): carrier: link connected
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9981] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9989] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 06:11:15 compute-1 NetworkManager[49015]: <info>  [1764396675.9990] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0001] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0013] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0024] device (eth1): carrier: link connected
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0032] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0043] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (f58d442a-350a-5956-a954-8dae41cac9cb) (indicated)
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0044] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0051] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0063] device (eth1): Activation: starting connection 'ci-private-network' (f58d442a-350a-5956-a954-8dae41cac9cb)
Nov 29 06:11:16 compute-1 systemd[1]: Started Network Manager.
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0072] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0088] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0093] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0096] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0100] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0105] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0109] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0114] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0120] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0131] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0136] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0150] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0170] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0185] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0190] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0196] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0205] device (lo): Activation: successful, device activated.
Nov 29 06:11:16 compute-1 systemd[1]: Starting Network Manager Wait Online...
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0220] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0300] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0311] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0322] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0328] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0335] device (eth1): Activation: successful, device activated.
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0349] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0351] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0357] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0363] device (eth0): Activation: successful, device activated.
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0371] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 06:11:16 compute-1 NetworkManager[49015]: <info>  [1764396676.0414] manager: startup complete
Nov 29 06:11:16 compute-1 sudo[49004]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:16 compute-1 systemd[1]: Finished Network Manager Wait Online.
Nov 29 06:11:17 compute-1 sudo[49230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqjbaiqlrwtcxcqimtiyglppwwjwufwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396677.2256336-470-125640529978540/AnsiballZ_dnf.py'
Nov 29 06:11:17 compute-1 sudo[49230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:17 compute-1 python3.9[49232]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:11:23 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:11:24 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:11:24 compute-1 systemd[1]: Reloading.
Nov 29 06:11:24 compute-1 systemd-rc-local-generator[49282]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:11:24 compute-1 systemd-sysv-generator[49287]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:11:24 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:11:25 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:11:25 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:11:25 compute-1 systemd[1]: run-re19393818e514a5ca2b2f35d9632efbd.service: Deactivated successfully.
Nov 29 06:11:25 compute-1 sudo[49230]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:26 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 06:11:29 compute-1 sudo[49689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssgvzacxixxcbjerjmfutqvlmkuoeuek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396689.605473-506-245735890714276/AnsiballZ_stat.py'
Nov 29 06:11:29 compute-1 sudo[49689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:30 compute-1 python3.9[49691]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:11:30 compute-1 sudo[49689]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:30 compute-1 sudo[49841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqhryvqdkyezccakbwarrocwprrydoxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396690.4639313-533-84414807339335/AnsiballZ_ini_file.py'
Nov 29 06:11:30 compute-1 sudo[49841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:31 compute-1 python3.9[49843]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:31 compute-1 sudo[49841]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:31 compute-1 sudo[49995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ornagfrtxgyxbxaqadlfkggxkqlhsqaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396691.5411758-563-266996262886118/AnsiballZ_ini_file.py'
Nov 29 06:11:31 compute-1 sudo[49995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:32 compute-1 python3.9[49997]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:32 compute-1 sudo[49995]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:32 compute-1 sudo[50147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jigoiawsmqwbfhbnmxcexkyvqeltcsxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396692.3516119-563-4014397601962/AnsiballZ_ini_file.py'
Nov 29 06:11:32 compute-1 sudo[50147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:33 compute-1 python3.9[50149]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:33 compute-1 sudo[50147]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:33 compute-1 sudo[50299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmgvmnnitcvcjtnguricnqlwnxoztqna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396693.2988443-608-117598967385386/AnsiballZ_ini_file.py'
Nov 29 06:11:33 compute-1 sudo[50299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:33 compute-1 python3.9[50301]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:33 compute-1 sudo[50299]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:34 compute-1 sudo[50451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqfymyieigbfqykbbaqfnukyootmakth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396694.0573025-608-25030812126402/AnsiballZ_ini_file.py'
Nov 29 06:11:34 compute-1 sudo[50451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:34 compute-1 python3.9[50453]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:34 compute-1 sudo[50451]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:35 compute-1 sudo[50603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewgebbjdhevldxsxtgudxtqcpnpodqku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396694.7852724-653-19918081996793/AnsiballZ_stat.py'
Nov 29 06:11:35 compute-1 sudo[50603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:35 compute-1 python3.9[50605]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:11:35 compute-1 sudo[50603]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:35 compute-1 sudo[50726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goaahkpymfdplslcfmbquxnrcswsathi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396694.7852724-653-19918081996793/AnsiballZ_copy.py'
Nov 29 06:11:35 compute-1 sudo[50726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:36 compute-1 python3.9[50728]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396694.7852724-653-19918081996793/.source _original_basename=.i40v_3pn follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:36 compute-1 sudo[50726]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:36 compute-1 sudo[50878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqyfgpfnufwhomowimunpbnpbpdrbvtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396696.2520669-698-76796935053253/AnsiballZ_file.py'
Nov 29 06:11:36 compute-1 sudo[50878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:36 compute-1 python3.9[50880]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:36 compute-1 sudo[50878]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:37 compute-1 sudo[51030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boidxgpabbrljjvrhpbwurssoguabmrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396697.2589111-722-87111526886155/AnsiballZ_edpm_os_net_config_mappings.py'
Nov 29 06:11:37 compute-1 sudo[51030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:37 compute-1 python3.9[51032]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 29 06:11:37 compute-1 sudo[51030]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:38 compute-1 sudo[51182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rruuwzruhdhqrwoorwyhukpejxnoghqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396698.2514246-749-86055055966804/AnsiballZ_file.py'
Nov 29 06:11:38 compute-1 sudo[51182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:38 compute-1 python3.9[51184]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:38 compute-1 sudo[51182]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:39 compute-1 sudo[51334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyxbcibzsmcciatagrboixrechhquxua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396699.1858544-779-246728317929473/AnsiballZ_stat.py'
Nov 29 06:11:39 compute-1 sudo[51334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:39 compute-1 sudo[51334]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:40 compute-1 sudo[51457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnlzlwulnymaiznydyifnsszxscgzfxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396699.1858544-779-246728317929473/AnsiballZ_copy.py'
Nov 29 06:11:40 compute-1 sudo[51457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:40 compute-1 sudo[51457]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:41 compute-1 sudo[51609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltqruyjqjwttgyqghukcxycmhfkiyfyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396700.6102204-824-100282884932110/AnsiballZ_slurp.py'
Nov 29 06:11:41 compute-1 sudo[51609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:41 compute-1 python3.9[51611]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 29 06:11:41 compute-1 sudo[51609]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:42 compute-1 sudo[51784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwbbfxcpavukumnjktnjbutnwexgfjne ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396701.789081-851-279638518384644/async_wrapper.py j640183032971 300 /home/zuul/.ansible/tmp/ansible-tmp-1764396701.789081-851-279638518384644/AnsiballZ_edpm_os_net_config.py _'
Nov 29 06:11:42 compute-1 sudo[51784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:42 compute-1 ansible-async_wrapper.py[51786]: Invoked with j640183032971 300 /home/zuul/.ansible/tmp/ansible-tmp-1764396701.789081-851-279638518384644/AnsiballZ_edpm_os_net_config.py _
Nov 29 06:11:42 compute-1 ansible-async_wrapper.py[51789]: Starting module and watcher
Nov 29 06:11:42 compute-1 ansible-async_wrapper.py[51789]: Start watching 51790 (300)
Nov 29 06:11:42 compute-1 ansible-async_wrapper.py[51790]: Start module (51790)
Nov 29 06:11:42 compute-1 ansible-async_wrapper.py[51786]: Return async_wrapper task started.
Nov 29 06:11:42 compute-1 sudo[51784]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:42 compute-1 python3.9[51791]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 29 06:11:43 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 29 06:11:43 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 29 06:11:43 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 29 06:11:43 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 29 06:11:43 compute-1 kernel: cfg80211: failed to load regulatory.db
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.8463] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.8483] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.8964] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.8965] audit: op="connection-add" uuid="b952c8cb-7611-4778-be4c-bc06323a4506" name="br-ex-br" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.8989] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.8991] audit: op="connection-add" uuid="49f990b4-775c-4e02-bc38-e1a6aa226fe6" name="br-ex-port" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9009] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9010] audit: op="connection-add" uuid="d660df85-a2ac-4a29-8373-2f101d79d67f" name="eth1-port" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9027] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9028] audit: op="connection-add" uuid="41c18e8e-76c3-4e04-b760-435f81f44e36" name="vlan20-port" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9046] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9047] audit: op="connection-add" uuid="7a2c3747-3af2-4f7f-805f-3e293cbd8731" name="vlan21-port" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9063] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9064] audit: op="connection-add" uuid="4ab8e7b1-0190-4e16-a357-42a51c21fc46" name="vlan22-port" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9079] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9080] audit: op="connection-add" uuid="f976a7c4-f8bd-4537-9323-2d66bbc42bb6" name="vlan23-port" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9105] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9125] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9126] audit: op="connection-add" uuid="e861a275-5e1d-46ad-8a25-d41c1b167ab9" name="br-ex-if" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9186] audit: op="connection-update" uuid="f58d442a-350a-5956-a954-8dae41cac9cb" name="ci-private-network" args="ipv6.dns,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routing-rules,ipv6.routes,ipv6.method,ovs-external-ids.data,ipv4.dns,ipv4.addresses,ipv4.routes,ipv4.routing-rules,ipv4.method,ipv4.never-default,ovs-interface.type,connection.slave-type,connection.master,connection.controller,connection.port-type,connection.timestamp" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9206] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9207] audit: op="connection-add" uuid="f4fde273-6e7e-4204-b63a-acc23e947b61" name="vlan20-if" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9226] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9227] audit: op="connection-add" uuid="789f8a59-33d5-4ba4-9524-ea4d771fc63e" name="vlan21-if" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9247] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9249] audit: op="connection-add" uuid="76c2f154-19b3-46e5-a04d-a31e660b0888" name="vlan22-if" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9270] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9271] audit: op="connection-add" uuid="651303cc-960e-4019-b96b-2d109f1db40f" name="vlan23-if" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9288] audit: op="connection-delete" uuid="869e6d79-5f7b-3b9f-b76e-078155d890a2" name="Wired connection 1" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9302] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9313] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9318] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (b952c8cb-7611-4778-be4c-bc06323a4506)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9319] audit: op="connection-activate" uuid="b952c8cb-7611-4778-be4c-bc06323a4506" name="br-ex-br" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9320] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9328] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9332] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (49f990b4-775c-4e02-bc38-e1a6aa226fe6)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9333] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9339] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9344] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (d660df85-a2ac-4a29-8373-2f101d79d67f)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9348] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9354] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9358] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (41c18e8e-76c3-4e04-b760-435f81f44e36)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9360] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9367] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9371] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (7a2c3747-3af2-4f7f-805f-3e293cbd8731)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9372] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9378] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9383] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (4ab8e7b1-0190-4e16-a357-42a51c21fc46)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9384] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9391] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9395] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (f976a7c4-f8bd-4537-9323-2d66bbc42bb6)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9396] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9398] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9399] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9407] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9411] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9415] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (e861a275-5e1d-46ad-8a25-d41c1b167ab9)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9416] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9419] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9420] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9421] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9422] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9431] device (eth1): disconnecting for new activation request.
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9431] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9433] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9435] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9435] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9438] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9441] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9444] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (f4fde273-6e7e-4204-b63a-acc23e947b61)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9444] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9446] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9473] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9475] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9479] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9486] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9491] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (789f8a59-33d5-4ba4-9524-ea4d771fc63e)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9493] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9497] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9499] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9501] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9505] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9511] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9517] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (76c2f154-19b3-46e5-a04d-a31e660b0888)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9518] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9521] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9524] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9526] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9530] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9536] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9543] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (651303cc-960e-4019-b96b-2d109f1db40f)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9544] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9548] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9550] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9552] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9554] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9570] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9573] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9577] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9580] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9588] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9593] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9598] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9603] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9605] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9612] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9617] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9622] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9625] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9631] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9636] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9641] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9644] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9650] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9656] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9660] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9663] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9669] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9675] dhcp4 (eth0): canceled DHCP transaction
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9675] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9675] dhcp4 (eth0): state changed no lease
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9677] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9690] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51792 uid=0 result="fail" reason="Device is not activated"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9731] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9768] device (eth1): disconnecting for new activation request.
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9768] audit: op="connection-activate" uuid="f58d442a-350a-5956-a954-8dae41cac9cb" name="ci-private-network" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9773] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9781] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 29 06:11:44 compute-1 kernel: ovs-system: entered promiscuous mode
Nov 29 06:11:44 compute-1 kernel: Timeout policy base is empty
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9809] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9818] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51792 uid=0 result="success"
Nov 29 06:11:44 compute-1 systemd-udevd[51797]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9836] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 29 06:11:44 compute-1 NetworkManager[49015]: <info>  [1764396704.9856] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Nov 29 06:11:44 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 06:11:45 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0068] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0214] device (eth1): Activation: starting connection 'ci-private-network' (f58d442a-350a-5956-a954-8dae41cac9cb)
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0232] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0261] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0274] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 kernel: br-ex: entered promiscuous mode
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0303] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0312] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0320] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0321] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0323] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0325] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0327] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0329] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0349] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0357] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0362] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0366] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0373] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0382] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0388] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0394] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0400] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0406] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0411] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0427] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0431] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0438] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0447] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0460] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:11:45 compute-1 kernel: vlan22: entered promiscuous mode
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0487] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0502] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0504] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0509] device (eth1): Activation: successful, device activated.
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0555] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0560] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0569] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:11:45 compute-1 kernel: vlan23: entered promiscuous mode
Nov 29 06:11:45 compute-1 systemd-udevd[51798]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0631] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0654] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0698] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0700] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0710] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:11:45 compute-1 kernel: vlan20: entered promiscuous mode
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.0977] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1005] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1039] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1041] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1052] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1070] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1094] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 kernel: vlan21: entered promiscuous mode
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1121] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1123] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1127] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:11:45 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1245] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1267] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1300] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1302] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:11:45 compute-1 NetworkManager[49015]: <info>  [1764396705.1311] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:11:46 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 06:11:46 compute-1 NetworkManager[49015]: <info>  [1764396706.2161] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51792 uid=0 result="success"
Nov 29 06:11:46 compute-1 sudo[52149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klzualgjkcdxjdshlabqzaqklsdjzlay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396705.9257898-851-184829525909386/AnsiballZ_async_status.py'
Nov 29 06:11:46 compute-1 sudo[52149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:46 compute-1 NetworkManager[49015]: <info>  [1764396706.4124] checkpoint[0x562165440950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 29 06:11:46 compute-1 NetworkManager[49015]: <info>  [1764396706.4127] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51792 uid=0 result="success"
Nov 29 06:11:46 compute-1 python3.9[52151]: ansible-ansible.legacy.async_status Invoked with jid=j640183032971.51786 mode=status _async_dir=/root/.ansible_async
Nov 29 06:11:46 compute-1 sudo[52149]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:46 compute-1 NetworkManager[49015]: <info>  [1764396706.7778] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51792 uid=0 result="success"
Nov 29 06:11:46 compute-1 NetworkManager[49015]: <info>  [1764396706.7797] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51792 uid=0 result="success"
Nov 29 06:11:47 compute-1 NetworkManager[49015]: <info>  [1764396707.0845] audit: op="networking-control" arg="global-dns-configuration" pid=51792 uid=0 result="success"
Nov 29 06:11:47 compute-1 NetworkManager[49015]: <info>  [1764396707.0899] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 29 06:11:47 compute-1 NetworkManager[49015]: <info>  [1764396707.0947] audit: op="networking-control" arg="global-dns-configuration" pid=51792 uid=0 result="success"
Nov 29 06:11:47 compute-1 NetworkManager[49015]: <info>  [1764396707.0987] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51792 uid=0 result="success"
Nov 29 06:11:47 compute-1 NetworkManager[49015]: <info>  [1764396707.3567] checkpoint[0x562165440a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 29 06:11:47 compute-1 NetworkManager[49015]: <info>  [1764396707.3572] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51792 uid=0 result="success"
Nov 29 06:11:47 compute-1 ansible-async_wrapper.py[51790]: Module complete (51790)
Nov 29 06:11:47 compute-1 ansible-async_wrapper.py[51789]: Done in kid B.
Nov 29 06:11:49 compute-1 sudo[52255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppfmosjeeuxmhljhwnpqmtdidycityka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396705.9257898-851-184829525909386/AnsiballZ_async_status.py'
Nov 29 06:11:49 compute-1 sudo[52255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:50 compute-1 python3.9[52257]: ansible-ansible.legacy.async_status Invoked with jid=j640183032971.51786 mode=status _async_dir=/root/.ansible_async
Nov 29 06:11:50 compute-1 sudo[52255]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:50 compute-1 sudo[52355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpowhrjppywbcdkcsrhnjlrlocnhdjhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396705.9257898-851-184829525909386/AnsiballZ_async_status.py'
Nov 29 06:11:50 compute-1 sudo[52355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:50 compute-1 python3.9[52357]: ansible-ansible.legacy.async_status Invoked with jid=j640183032971.51786 mode=cleanup _async_dir=/root/.ansible_async
Nov 29 06:11:50 compute-1 sudo[52355]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:51 compute-1 sudo[52507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woapnniwidwvbiqkitsuojuwdrhxserw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396711.0557082-932-210303002402204/AnsiballZ_stat.py'
Nov 29 06:11:51 compute-1 sudo[52507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:51 compute-1 python3.9[52509]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:11:51 compute-1 sudo[52507]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:52 compute-1 sudo[52630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-purnsxwsudfiblvbckcnsiasygrexklt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396711.0557082-932-210303002402204/AnsiballZ_copy.py'
Nov 29 06:11:52 compute-1 sudo[52630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:52 compute-1 python3.9[52632]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396711.0557082-932-210303002402204/.source.returncode _original_basename=.t0pbo1y7 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:52 compute-1 sudo[52630]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:52 compute-1 sudo[52782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smwkqltpcgdoqrlofknuwqqdvhjaoztq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396712.5962775-980-47150206159258/AnsiballZ_stat.py'
Nov 29 06:11:52 compute-1 sudo[52782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:53 compute-1 python3.9[52784]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:11:53 compute-1 sudo[52782]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:53 compute-1 sudo[52906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpelwxcijhqnttsreztkhpmrhwdtqzsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396712.5962775-980-47150206159258/AnsiballZ_copy.py'
Nov 29 06:11:53 compute-1 sudo[52906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:53 compute-1 python3.9[52908]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396712.5962775-980-47150206159258/.source.cfg _original_basename=.f60mxffo follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:11:53 compute-1 sudo[52906]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:54 compute-1 sudo[53058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngedvlamvtjlwulctsgjotnrkxsulqea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396714.2859979-1025-32352748538938/AnsiballZ_systemd.py'
Nov 29 06:11:54 compute-1 sudo[53058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:11:54 compute-1 python3.9[53060]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:11:55 compute-1 systemd[1]: Reloading Network Manager...
Nov 29 06:11:55 compute-1 NetworkManager[49015]: <info>  [1764396715.0571] audit: op="reload" arg="0" pid=53064 uid=0 result="success"
Nov 29 06:11:55 compute-1 NetworkManager[49015]: <info>  [1764396715.0582] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 29 06:11:55 compute-1 systemd[1]: Reloaded Network Manager.
Nov 29 06:11:55 compute-1 sudo[53058]: pam_unix(sudo:session): session closed for user root
Nov 29 06:11:55 compute-1 sshd-session[53093]: Received disconnect from 45.55.249.98 port 49908:11: Bye Bye [preauth]
Nov 29 06:11:55 compute-1 sshd-session[53093]: Disconnected from authenticating user root 45.55.249.98 port 49908 [preauth]
Nov 29 06:11:55 compute-1 sshd-session[45016]: Connection closed by 192.168.122.30 port 52326
Nov 29 06:11:55 compute-1 sshd-session[45013]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:11:55 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Nov 29 06:11:55 compute-1 systemd[1]: session-11.scope: Consumed 59.132s CPU time.
Nov 29 06:11:55 compute-1 systemd-logind[785]: Session 11 logged out. Waiting for processes to exit.
Nov 29 06:11:55 compute-1 systemd-logind[785]: Removed session 11.
Nov 29 06:12:00 compute-1 sshd-session[53097]: Accepted publickey for zuul from 192.168.122.30 port 56712 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:12:00 compute-1 systemd-logind[785]: New session 12 of user zuul.
Nov 29 06:12:00 compute-1 systemd[1]: Started Session 12 of User zuul.
Nov 29 06:12:00 compute-1 sshd-session[53097]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:12:01 compute-1 python3.9[53250]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:12:02 compute-1 python3.9[53404]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:12:05 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 06:12:05 compute-1 python3.9[53598]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:12:06 compute-1 sshd-session[53100]: Connection closed by 192.168.122.30 port 56712
Nov 29 06:12:06 compute-1 sshd-session[53097]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:12:06 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Nov 29 06:12:06 compute-1 systemd[1]: session-12.scope: Consumed 2.652s CPU time.
Nov 29 06:12:06 compute-1 systemd-logind[785]: Session 12 logged out. Waiting for processes to exit.
Nov 29 06:12:06 compute-1 systemd-logind[785]: Removed session 12.
Nov 29 06:12:10 compute-1 sshd-session[53628]: Invalid user student from 118.194.230.250 port 49548
Nov 29 06:12:10 compute-1 sshd-session[53628]: Received disconnect from 118.194.230.250 port 49548:11: Bye Bye [preauth]
Nov 29 06:12:10 compute-1 sshd-session[53628]: Disconnected from invalid user student 118.194.230.250 port 49548 [preauth]
Nov 29 06:12:11 compute-1 sshd-session[53630]: Accepted publickey for zuul from 192.168.122.30 port 55832 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:12:11 compute-1 systemd-logind[785]: New session 13 of user zuul.
Nov 29 06:12:11 compute-1 systemd[1]: Started Session 13 of User zuul.
Nov 29 06:12:11 compute-1 sshd-session[53630]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:12:13 compute-1 python3.9[53784]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:12:14 compute-1 python3.9[53938]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:12:15 compute-1 sudo[54092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfnqhnkjhyzakuusllfxqkdncsrkfnhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396734.8731163-86-153113693744403/AnsiballZ_setup.py'
Nov 29 06:12:15 compute-1 sudo[54092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:15 compute-1 python3.9[54094]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:12:15 compute-1 sudo[54092]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:16 compute-1 sudo[54177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnzoqlvajotycwwtlxmvcbulrsdvfour ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396734.8731163-86-153113693744403/AnsiballZ_dnf.py'
Nov 29 06:12:16 compute-1 sudo[54177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:16 compute-1 python3.9[54179]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:12:17 compute-1 sudo[54177]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:18 compute-1 sudo[54330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcutshnhnoncixcrdvwzbapkbylcdiyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396738.2324226-122-33273655393452/AnsiballZ_setup.py'
Nov 29 06:12:18 compute-1 sudo[54330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:18 compute-1 python3.9[54332]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:12:20 compute-1 sudo[54330]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:21 compute-1 sudo[54525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnlebjymdabqqqseagwjfwknkdcycmgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396740.5985765-155-34979111561972/AnsiballZ_file.py'
Nov 29 06:12:21 compute-1 sudo[54525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:21 compute-1 python3.9[54527]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:12:21 compute-1 sudo[54525]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:21 compute-1 sudo[54677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sshvhhaprkjpbeyefgytezaesmudmtzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396741.4780693-179-72933677074182/AnsiballZ_command.py'
Nov 29 06:12:21 compute-1 sudo[54677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:22 compute-1 python3.9[54679]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:12:22 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat3573815175-merged.mount: Deactivated successfully.
Nov 29 06:12:22 compute-1 podman[54680]: 2025-11-29 06:12:22.192239362 +0000 UTC m=+0.058904675 system refresh
Nov 29 06:12:22 compute-1 sudo[54677]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:23 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:12:24 compute-1 sudo[54841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zghdtfypwhaapihvqyvviypjlqtggjpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396744.131724-203-102553718786577/AnsiballZ_stat.py'
Nov 29 06:12:24 compute-1 sudo[54841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:24 compute-1 python3.9[54843]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:12:24 compute-1 sudo[54841]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:25 compute-1 sudo[54964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cugbdspxfusmupusluefvhiseqljayje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396744.131724-203-102553718786577/AnsiballZ_copy.py'
Nov 29 06:12:25 compute-1 sudo[54964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:25 compute-1 python3.9[54966]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396744.131724-203-102553718786577/.source.json follow=False _original_basename=podman_network_config.j2 checksum=22f94c64376a85d67765fd46a234a717ce2c216b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:12:25 compute-1 sudo[54964]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:26 compute-1 sudo[55116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlkiykoupkwkrswpuecinhwtvqysthgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396745.7698796-248-176156533813758/AnsiballZ_stat.py'
Nov 29 06:12:26 compute-1 sudo[55116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:26 compute-1 python3.9[55118]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:12:26 compute-1 sudo[55116]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:27 compute-1 sudo[55239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkuilnbplivvozfjehdvjudtnbrqawog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396745.7698796-248-176156533813758/AnsiballZ_copy.py'
Nov 29 06:12:27 compute-1 sudo[55239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:27 compute-1 python3.9[55241]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396745.7698796-248-176156533813758/.source.conf follow=False _original_basename=registries.conf.j2 checksum=25aa6c560e50dcbd81b989ea46a7865cb55b8998 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:12:27 compute-1 sudo[55239]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:27 compute-1 sudo[55391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sztkufegmwhrmwdstqfymetmlvxqihva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396747.5073054-296-81083424528017/AnsiballZ_ini_file.py'
Nov 29 06:12:27 compute-1 sudo[55391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:28 compute-1 python3.9[55393]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:12:28 compute-1 sudo[55391]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:28 compute-1 sudo[55543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfcvooxfweinnohmuipburupcgwaiyas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396748.3640482-296-251557107208931/AnsiballZ_ini_file.py'
Nov 29 06:12:28 compute-1 sudo[55543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:28 compute-1 python3.9[55545]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:12:28 compute-1 sudo[55543]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:29 compute-1 sudo[55695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrersgnactwsbppxyzphkrbmdnllmxxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396749.007291-296-280592012029456/AnsiballZ_ini_file.py'
Nov 29 06:12:29 compute-1 sudo[55695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:29 compute-1 python3.9[55697]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:12:29 compute-1 sudo[55695]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:29 compute-1 sudo[55847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpweqnrdyrlyrcahdvkdpkqswigtkhbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396749.690413-296-201803187846046/AnsiballZ_ini_file.py'
Nov 29 06:12:29 compute-1 sudo[55847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:30 compute-1 python3.9[55849]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:12:30 compute-1 sudo[55847]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:31 compute-1 sudo[55999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgtmtfaxxhutrsljlqbixcialfzuhdyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396750.856822-389-4028804207478/AnsiballZ_dnf.py'
Nov 29 06:12:31 compute-1 sudo[55999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:31 compute-1 python3.9[56001]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:12:32 compute-1 sudo[55999]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:33 compute-1 sudo[56152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhlgjexhtlrszvvwniqvsiktkmlakglz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396753.2573721-422-86254639423583/AnsiballZ_setup.py'
Nov 29 06:12:33 compute-1 sudo[56152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:33 compute-1 python3.9[56154]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:12:33 compute-1 sudo[56152]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:34 compute-1 sudo[56306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuqxuxctpgpzffydluyaqgecezxynjri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396754.1894011-446-42884141834666/AnsiballZ_stat.py'
Nov 29 06:12:34 compute-1 sudo[56306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:34 compute-1 python3.9[56308]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:12:34 compute-1 sudo[56306]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:35 compute-1 sudo[56458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zslxidhungbbsxnxfhrqwwcfplssaoiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396755.329857-473-269846689931229/AnsiballZ_stat.py'
Nov 29 06:12:35 compute-1 sudo[56458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:35 compute-1 python3.9[56460]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:12:35 compute-1 sudo[56458]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:36 compute-1 sudo[56610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kafwvrqhxvopemtojmzqperqpoauqcgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396756.3243225-503-64221052604724/AnsiballZ_command.py'
Nov 29 06:12:36 compute-1 sudo[56610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:36 compute-1 python3.9[56612]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:12:36 compute-1 sudo[56610]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:37 compute-1 sudo[56763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugheyiqufwjspfeofmruibgxlfkxkdsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396757.327034-533-110921948732136/AnsiballZ_service_facts.py'
Nov 29 06:12:37 compute-1 sudo[56763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:38 compute-1 python3.9[56765]: ansible-service_facts Invoked
Nov 29 06:12:38 compute-1 network[56782]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:12:38 compute-1 network[56783]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:12:38 compute-1 network[56784]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:12:41 compute-1 sudo[56763]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:43 compute-1 sudo[57067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jymhuoctmdzouomihyyygcywftqucpvh ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764396763.5001948-578-237652252517849/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764396763.5001948-578-237652252517849/args'
Nov 29 06:12:43 compute-1 sudo[57067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:43 compute-1 sudo[57067]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:44 compute-1 sudo[57234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihehtvrfdploevnodhymqfnnhfwndixa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396764.4035487-611-225159979163172/AnsiballZ_dnf.py'
Nov 29 06:12:44 compute-1 sudo[57234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:44 compute-1 python3.9[57236]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:12:47 compute-1 sudo[57234]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:48 compute-1 sudo[57387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhtjqnwehhbwcbiebodiazkakxmuzqqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396767.8616815-650-251923512487059/AnsiballZ_package_facts.py'
Nov 29 06:12:48 compute-1 sudo[57387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:48 compute-1 python3.9[57389]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 29 06:12:49 compute-1 sudo[57387]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:50 compute-1 sudo[57539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqvzhokmriuplylzniiutzlowmfomzgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396769.8162096-680-95232814609151/AnsiballZ_stat.py'
Nov 29 06:12:50 compute-1 sudo[57539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:50 compute-1 python3.9[57541]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:12:50 compute-1 sudo[57539]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:50 compute-1 sudo[57664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fonmexyjuytkfcyabjwgwtwvtsijvudy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396769.8162096-680-95232814609151/AnsiballZ_copy.py'
Nov 29 06:12:50 compute-1 sudo[57664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:51 compute-1 python3.9[57666]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396769.8162096-680-95232814609151/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:12:51 compute-1 sudo[57664]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:51 compute-1 sudo[57818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffbsulqsqflawfyjprynqrkqvpqwqgna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396771.4413092-726-163996507871881/AnsiballZ_stat.py'
Nov 29 06:12:51 compute-1 sudo[57818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:51 compute-1 python3.9[57820]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:12:52 compute-1 sudo[57818]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:52 compute-1 sudo[57943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrvrtgtppjeejflposabnwcjgzsqkhws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396771.4413092-726-163996507871881/AnsiballZ_copy.py'
Nov 29 06:12:52 compute-1 sudo[57943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:52 compute-1 python3.9[57945]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396771.4413092-726-163996507871881/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:12:52 compute-1 sudo[57943]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:54 compute-1 sudo[58097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrctftozluzdjcjiynlxunrfwcnyczpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396773.7565565-789-40460143148045/AnsiballZ_lineinfile.py'
Nov 29 06:12:54 compute-1 sudo[58097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:54 compute-1 python3.9[58099]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:12:54 compute-1 sudo[58097]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:56 compute-1 sudo[58251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmfplusotxqvoslkjcftnlqsgpvtoypg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396775.6353574-834-214119907482081/AnsiballZ_setup.py'
Nov 29 06:12:56 compute-1 sudo[58251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:56 compute-1 python3.9[58253]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:12:56 compute-1 sudo[58251]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:57 compute-1 sudo[58335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvngknudbulzsboopcmbshabbymtgovl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396775.6353574-834-214119907482081/AnsiballZ_systemd.py'
Nov 29 06:12:57 compute-1 sudo[58335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:57 compute-1 python3.9[58337]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:12:57 compute-1 sudo[58335]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:58 compute-1 sudo[58489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiowlmluztutmughvlkshxhrfghqkuty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396778.4517903-881-135160099155317/AnsiballZ_setup.py'
Nov 29 06:12:58 compute-1 sudo[58489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:59 compute-1 python3.9[58491]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:12:59 compute-1 sudo[58489]: pam_unix(sudo:session): session closed for user root
Nov 29 06:12:59 compute-1 sudo[58573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpkuzeefwvogedxvxyardfckuszasiqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396778.4517903-881-135160099155317/AnsiballZ_systemd.py'
Nov 29 06:12:59 compute-1 sudo[58573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:12:59 compute-1 python3.9[58575]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:13:00 compute-1 systemd[1]: Stopping NTP client/server...
Nov 29 06:13:00 compute-1 chronyd[792]: chronyd exiting
Nov 29 06:13:00 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Nov 29 06:13:00 compute-1 systemd[1]: Stopped NTP client/server.
Nov 29 06:13:00 compute-1 systemd[1]: Starting NTP client/server...
Nov 29 06:13:00 compute-1 chronyd[58583]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 06:13:00 compute-1 chronyd[58583]: Frequency -26.426 +/- 0.401 ppm read from /var/lib/chrony/drift
Nov 29 06:13:00 compute-1 chronyd[58583]: Loaded seccomp filter (level 2)
Nov 29 06:13:00 compute-1 systemd[1]: Started NTP client/server.
Nov 29 06:13:00 compute-1 sudo[58573]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:00 compute-1 sshd-session[53633]: Connection closed by 192.168.122.30 port 55832
Nov 29 06:13:00 compute-1 sshd-session[53630]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:13:00 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Nov 29 06:13:00 compute-1 systemd[1]: session-13.scope: Consumed 30.578s CPU time.
Nov 29 06:13:00 compute-1 systemd-logind[785]: Session 13 logged out. Waiting for processes to exit.
Nov 29 06:13:00 compute-1 systemd-logind[785]: Removed session 13.
Nov 29 06:13:06 compute-1 sshd-session[58609]: Accepted publickey for zuul from 192.168.122.30 port 43182 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:13:06 compute-1 systemd-logind[785]: New session 14 of user zuul.
Nov 29 06:13:06 compute-1 systemd[1]: Started Session 14 of User zuul.
Nov 29 06:13:06 compute-1 sshd-session[58609]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:13:06 compute-1 sudo[58762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gohgjatodqdaxaadunoofetvnxzqmtix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396786.3304608-33-235843256096676/AnsiballZ_file.py'
Nov 29 06:13:06 compute-1 sudo[58762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:07 compute-1 python3.9[58764]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:07 compute-1 sudo[58762]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:07 compute-1 sshd-session[58788]: Invalid user gits from 71.70.164.48 port 52173
Nov 29 06:13:07 compute-1 sshd-session[58788]: Received disconnect from 71.70.164.48 port 52173:11: Bye Bye [preauth]
Nov 29 06:13:07 compute-1 sshd-session[58788]: Disconnected from invalid user gits 71.70.164.48 port 52173 [preauth]
Nov 29 06:13:07 compute-1 sudo[58916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywidowhigchesewfzrlboocjxdzoiwww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396787.3173635-69-200403109098984/AnsiballZ_stat.py'
Nov 29 06:13:07 compute-1 sudo[58916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:07 compute-1 python3.9[58918]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:07 compute-1 sudo[58916]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:08 compute-1 sudo[59039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yymljahlarsfcetqfinzeyhfuisvsukp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396787.3173635-69-200403109098984/AnsiballZ_copy.py'
Nov 29 06:13:08 compute-1 sudo[59039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:08 compute-1 python3.9[59041]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396787.3173635-69-200403109098984/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:08 compute-1 sudo[59039]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:09 compute-1 sshd-session[58612]: Connection closed by 192.168.122.30 port 43182
Nov 29 06:13:09 compute-1 sshd-session[58609]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:13:09 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Nov 29 06:13:09 compute-1 systemd[1]: session-14.scope: Consumed 1.834s CPU time.
Nov 29 06:13:09 compute-1 systemd-logind[785]: Session 14 logged out. Waiting for processes to exit.
Nov 29 06:13:09 compute-1 systemd-logind[785]: Removed session 14.
Nov 29 06:13:11 compute-1 sshd-session[59066]: Invalid user sistemas from 45.55.249.98 port 45110
Nov 29 06:13:11 compute-1 sshd-session[59066]: Received disconnect from 45.55.249.98 port 45110:11: Bye Bye [preauth]
Nov 29 06:13:11 compute-1 sshd-session[59066]: Disconnected from invalid user sistemas 45.55.249.98 port 45110 [preauth]
Nov 29 06:13:14 compute-1 sshd-session[59068]: Accepted publickey for zuul from 192.168.122.30 port 43192 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:13:14 compute-1 systemd-logind[785]: New session 15 of user zuul.
Nov 29 06:13:14 compute-1 systemd[1]: Started Session 15 of User zuul.
Nov 29 06:13:14 compute-1 sshd-session[59068]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:13:15 compute-1 python3.9[59221]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:13:16 compute-1 sudo[59375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgnwmgyhbxlvvribsuoncedrambwjnsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396796.172802-64-218399355721173/AnsiballZ_file.py'
Nov 29 06:13:16 compute-1 sudo[59375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:16 compute-1 python3.9[59377]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:16 compute-1 sudo[59375]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:17 compute-1 sudo[59550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddjbhxenejpguyqfjplblwxhsknqfuho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396797.1468072-88-185448013696681/AnsiballZ_stat.py'
Nov 29 06:13:17 compute-1 sudo[59550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:17 compute-1 python3.9[59552]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:17 compute-1 sudo[59550]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:18 compute-1 sudo[59673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uecstenrwgtytddqdogrxedufsvltepw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396797.1468072-88-185448013696681/AnsiballZ_copy.py'
Nov 29 06:13:18 compute-1 sudo[59673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:18 compute-1 python3.9[59675]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764396797.1468072-88-185448013696681/.source.json _original_basename=.vkia39fo follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:18 compute-1 sudo[59673]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:19 compute-1 sudo[59825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvncblhlnctsdfixsoxsmhnytmszeuxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396799.286177-157-152693036389979/AnsiballZ_stat.py'
Nov 29 06:13:19 compute-1 sudo[59825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:19 compute-1 python3.9[59827]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:19 compute-1 sudo[59825]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:20 compute-1 sudo[59950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umlntxxwrwojutjzqoeahvdiwgtucpot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396799.286177-157-152693036389979/AnsiballZ_copy.py'
Nov 29 06:13:20 compute-1 sudo[59950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:20 compute-1 python3.9[59952]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396799.286177-157-152693036389979/.source _original_basename=.pqini69z follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:20 compute-1 sudo[59950]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:21 compute-1 sshd-session[59878]: Invalid user weblogic from 118.194.230.250 port 49648
Nov 29 06:13:21 compute-1 sshd-session[59878]: Received disconnect from 118.194.230.250 port 49648:11: Bye Bye [preauth]
Nov 29 06:13:21 compute-1 sshd-session[59878]: Disconnected from invalid user weblogic 118.194.230.250 port 49648 [preauth]
Nov 29 06:13:21 compute-1 sudo[60102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkxprstzticqciwdlaocrqhrbldpcntk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396801.0741131-205-212870248660647/AnsiballZ_file.py'
Nov 29 06:13:21 compute-1 sudo[60102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:21 compute-1 python3.9[60104]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:13:21 compute-1 sudo[60102]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:22 compute-1 sudo[60254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkobudtiiiuucasvjkmxhlgpmdskisfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396801.9029558-229-161521784381972/AnsiballZ_stat.py'
Nov 29 06:13:22 compute-1 sudo[60254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:22 compute-1 python3.9[60256]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:22 compute-1 sudo[60254]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:22 compute-1 sudo[60377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyorahlofavbeytlmllvkvbhtsfgmqxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396801.9029558-229-161521784381972/AnsiballZ_copy.py'
Nov 29 06:13:22 compute-1 sudo[60377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:23 compute-1 python3.9[60379]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396801.9029558-229-161521784381972/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:13:23 compute-1 sudo[60377]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:23 compute-1 sudo[60529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jahthrqgmnvcjzuygntwadsvzmullrca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396803.1960824-229-136296519110047/AnsiballZ_stat.py'
Nov 29 06:13:23 compute-1 sudo[60529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:23 compute-1 python3.9[60531]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:23 compute-1 sudo[60529]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:24 compute-1 sudo[60652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xynjyglatwulxmklpxgrpcumyzulniio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396803.1960824-229-136296519110047/AnsiballZ_copy.py'
Nov 29 06:13:24 compute-1 sudo[60652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:24 compute-1 python3.9[60654]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396803.1960824-229-136296519110047/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:13:24 compute-1 sudo[60652]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:24 compute-1 sudo[60804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmufrqqqgrhpsbxxpxmkierbdfyrfnku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396804.5805173-316-176770587923908/AnsiballZ_file.py'
Nov 29 06:13:24 compute-1 sudo[60804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:25 compute-1 python3.9[60806]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:25 compute-1 sudo[60804]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:25 compute-1 sudo[60956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twbuogbnghgawcweorynovsqqakwridv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396805.4823601-341-129878485737285/AnsiballZ_stat.py'
Nov 29 06:13:25 compute-1 sudo[60956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:26 compute-1 python3.9[60958]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:26 compute-1 sudo[60956]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:26 compute-1 sudo[61079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewpimjjculazftutvkrpmtqlegjnhoez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396805.4823601-341-129878485737285/AnsiballZ_copy.py'
Nov 29 06:13:26 compute-1 sudo[61079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:26 compute-1 python3.9[61081]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396805.4823601-341-129878485737285/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:26 compute-1 sudo[61079]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:27 compute-1 sudo[61231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyzdhdguybmpzoemtjiuslpyrossdwbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396806.9583416-385-192888254872458/AnsiballZ_stat.py'
Nov 29 06:13:27 compute-1 sudo[61231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:27 compute-1 python3.9[61233]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:27 compute-1 sudo[61231]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:27 compute-1 sudo[61354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mngfgoaqvqqbbbjnclyupyusumhyfidu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396806.9583416-385-192888254872458/AnsiballZ_copy.py'
Nov 29 06:13:27 compute-1 sudo[61354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:28 compute-1 python3.9[61356]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396806.9583416-385-192888254872458/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:28 compute-1 sudo[61354]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:29 compute-1 sudo[61506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buziakgkvkzhgcjsylwjxgkjzhakjztk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396808.562901-430-32466180758672/AnsiballZ_systemd.py'
Nov 29 06:13:29 compute-1 sudo[61506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:29 compute-1 python3.9[61508]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:13:29 compute-1 systemd[1]: Reloading.
Nov 29 06:13:29 compute-1 systemd-sysv-generator[61539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:29 compute-1 systemd-rc-local-generator[61535]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:30 compute-1 systemd[1]: Reloading.
Nov 29 06:13:30 compute-1 systemd-sysv-generator[61576]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:30 compute-1 systemd-rc-local-generator[61572]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:30 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Nov 29 06:13:30 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Nov 29 06:13:30 compute-1 sudo[61506]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:31 compute-1 sudo[61732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auykfbrtbpssnnobclsjhpxxfsbytyvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396810.7913022-454-131646669638305/AnsiballZ_stat.py'
Nov 29 06:13:31 compute-1 sudo[61732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:31 compute-1 python3.9[61734]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:31 compute-1 sudo[61732]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:31 compute-1 sudo[61855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlzxpwrrdxuxcbhzhwhzrzsnfsiubimv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396810.7913022-454-131646669638305/AnsiballZ_copy.py'
Nov 29 06:13:31 compute-1 sudo[61855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:32 compute-1 python3.9[61857]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396810.7913022-454-131646669638305/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:32 compute-1 sudo[61855]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:32 compute-1 sudo[62007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tffmrlvnbzkbenmlmjbfapnfahybikgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396812.2790513-499-206083136035783/AnsiballZ_stat.py'
Nov 29 06:13:32 compute-1 sudo[62007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:32 compute-1 python3.9[62009]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:32 compute-1 sudo[62007]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:33 compute-1 sudo[62130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmkiworomvsyldnasjlbslabsxzruwgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396812.2790513-499-206083136035783/AnsiballZ_copy.py'
Nov 29 06:13:33 compute-1 sudo[62130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:33 compute-1 python3.9[62132]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396812.2790513-499-206083136035783/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:33 compute-1 sudo[62130]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:34 compute-1 sudo[62282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntiilspnomrxvlytxmikmlfddkwjgygj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396813.7702658-544-54182125901149/AnsiballZ_systemd.py'
Nov 29 06:13:34 compute-1 sudo[62282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:34 compute-1 python3.9[62284]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:13:34 compute-1 systemd[1]: Reloading.
Nov 29 06:13:34 compute-1 systemd-rc-local-generator[62313]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:34 compute-1 systemd-sysv-generator[62316]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:34 compute-1 systemd[1]: Reloading.
Nov 29 06:13:34 compute-1 systemd-rc-local-generator[62348]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:34 compute-1 systemd-sysv-generator[62351]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:35 compute-1 systemd[1]: Starting Create netns directory...
Nov 29 06:13:35 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:13:35 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:13:35 compute-1 systemd[1]: Finished Create netns directory.
Nov 29 06:13:35 compute-1 sudo[62282]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:36 compute-1 python3.9[62511]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:13:36 compute-1 network[62528]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:13:36 compute-1 network[62529]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:13:36 compute-1 network[62530]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:13:40 compute-1 sudo[62790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntgmyppzteabfhrzohsyquitghpjmgwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396820.5420706-592-111120372204788/AnsiballZ_systemd.py'
Nov 29 06:13:40 compute-1 sudo[62790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:41 compute-1 python3.9[62792]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:13:41 compute-1 systemd[1]: Reloading.
Nov 29 06:13:41 compute-1 systemd-rc-local-generator[62822]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:41 compute-1 systemd-sysv-generator[62825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:41 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 29 06:13:41 compute-1 iptables.init[62832]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 29 06:13:42 compute-1 iptables.init[62832]: iptables: Flushing firewall rules: [  OK  ]
Nov 29 06:13:42 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Nov 29 06:13:42 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 29 06:13:42 compute-1 sudo[62790]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:42 compute-1 sudo[63026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqphrtctebfiiftgguhvnilvmymjyzxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396822.2915704-592-280379044413500/AnsiballZ_systemd.py'
Nov 29 06:13:42 compute-1 sudo[63026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:43 compute-1 python3.9[63028]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:13:43 compute-1 sudo[63026]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:44 compute-1 sudo[63180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfyrhdvgprdivllbkyyrhsuwgcmhassu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396824.3668594-640-64694446527361/AnsiballZ_systemd.py'
Nov 29 06:13:44 compute-1 sudo[63180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:45 compute-1 python3.9[63182]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:13:45 compute-1 systemd[1]: Reloading.
Nov 29 06:13:45 compute-1 systemd-sysv-generator[63216]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:13:45 compute-1 systemd-rc-local-generator[63212]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:13:45 compute-1 systemd[1]: Starting Netfilter Tables...
Nov 29 06:13:45 compute-1 systemd[1]: Finished Netfilter Tables.
Nov 29 06:13:45 compute-1 sudo[63180]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:46 compute-1 sudo[63372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwuruhurmnqfjztzkkvjfwooygyeubfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396825.799267-664-141967473333912/AnsiballZ_command.py'
Nov 29 06:13:46 compute-1 sudo[63372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:46 compute-1 python3.9[63374]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:13:46 compute-1 sudo[63372]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:47 compute-1 sudo[63525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbguhwlmusbnojxnyfyrunfmugsumrmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396827.1233733-706-270345891972877/AnsiballZ_stat.py'
Nov 29 06:13:47 compute-1 sudo[63525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:47 compute-1 python3.9[63527]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:47 compute-1 sudo[63525]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:48 compute-1 sudo[63650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nprcvinkorjrwmnqfxcvpnurqfnrnoez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396827.1233733-706-270345891972877/AnsiballZ_copy.py'
Nov 29 06:13:48 compute-1 sudo[63650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:48 compute-1 python3.9[63652]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396827.1233733-706-270345891972877/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:48 compute-1 sudo[63650]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:49 compute-1 sudo[63803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqykggjpimqdovoxkhvatjpkifsnfwel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396828.7471647-751-16699167849789/AnsiballZ_systemd.py'
Nov 29 06:13:49 compute-1 sudo[63803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:49 compute-1 python3.9[63805]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:13:49 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Nov 29 06:13:49 compute-1 sshd[1007]: Received SIGHUP; restarting.
Nov 29 06:13:49 compute-1 sshd[1007]: Server listening on 0.0.0.0 port 22.
Nov 29 06:13:49 compute-1 sshd[1007]: Server listening on :: port 22.
Nov 29 06:13:49 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Nov 29 06:13:49 compute-1 sudo[63803]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:50 compute-1 sudo[63959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwcjamkehtdmdvtvzxblvxotatlqksen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396829.7318156-775-119545020584831/AnsiballZ_file.py'
Nov 29 06:13:50 compute-1 sudo[63959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:50 compute-1 python3.9[63961]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:50 compute-1 sudo[63959]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:50 compute-1 sudo[64111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prefxvxddirgqiyyvcnqiikrafplturq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396830.5914392-799-101510981831214/AnsiballZ_stat.py'
Nov 29 06:13:50 compute-1 sudo[64111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:51 compute-1 python3.9[64113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:51 compute-1 sudo[64111]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:51 compute-1 sudo[64234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynfdwkeftshotgsvpuwqkcpbfvkhiuag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396830.5914392-799-101510981831214/AnsiballZ_copy.py'
Nov 29 06:13:51 compute-1 sudo[64234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:51 compute-1 python3.9[64236]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396830.5914392-799-101510981831214/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:51 compute-1 sudo[64234]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:52 compute-1 sudo[64386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyiaxvavzpylaavhifseyywawhnhflss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396832.4466426-853-155333993869156/AnsiballZ_timezone.py'
Nov 29 06:13:52 compute-1 sudo[64386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:53 compute-1 python3.9[64388]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 06:13:53 compute-1 systemd[1]: Starting Time & Date Service...
Nov 29 06:13:53 compute-1 systemd[1]: Started Time & Date Service.
Nov 29 06:13:53 compute-1 sudo[64386]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:54 compute-1 sudo[64542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aahpfanhwmntzrkddzjuubgbimkphikd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396833.6904738-880-169627177138013/AnsiballZ_file.py'
Nov 29 06:13:54 compute-1 sudo[64542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:54 compute-1 python3.9[64544]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:54 compute-1 sudo[64542]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:54 compute-1 sudo[64694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uavgzrgzgwfjvhvyhyyrlafypzleuosw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396834.504613-904-5399701062953/AnsiballZ_stat.py'
Nov 29 06:13:54 compute-1 sudo[64694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:55 compute-1 python3.9[64696]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:55 compute-1 sudo[64694]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:55 compute-1 sudo[64817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iduwpjndsiplskqdjgjybuwuyqjajafm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396834.504613-904-5399701062953/AnsiballZ_copy.py'
Nov 29 06:13:55 compute-1 sudo[64817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:55 compute-1 python3.9[64819]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396834.504613-904-5399701062953/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:55 compute-1 sudo[64817]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:56 compute-1 sudo[64969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdpssvxidatrnmmucuyygtstlucdzhlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396836.004822-950-188393539029024/AnsiballZ_stat.py'
Nov 29 06:13:56 compute-1 sudo[64969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:56 compute-1 python3.9[64971]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:56 compute-1 sudo[64969]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:57 compute-1 sudo[65092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmqobclggbxnmqlbwkjkhwueuxjzpmpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396836.004822-950-188393539029024/AnsiballZ_copy.py'
Nov 29 06:13:57 compute-1 sudo[65092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:57 compute-1 python3.9[65094]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396836.004822-950-188393539029024/.source.yaml _original_basename=.imv88qg5 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:57 compute-1 sudo[65092]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:57 compute-1 sudo[65244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsrraolniwirloawdxjjsvcnilripjri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396837.575449-994-70796429740603/AnsiballZ_stat.py'
Nov 29 06:13:57 compute-1 sudo[65244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:58 compute-1 python3.9[65246]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:13:58 compute-1 sudo[65244]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:58 compute-1 sudo[65367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygmdraqksiwkwvbwqnfmxrbrhyogcrxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396837.575449-994-70796429740603/AnsiballZ_copy.py'
Nov 29 06:13:58 compute-1 sudo[65367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:58 compute-1 python3.9[65369]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396837.575449-994-70796429740603/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:13:58 compute-1 sudo[65367]: pam_unix(sudo:session): session closed for user root
Nov 29 06:13:59 compute-1 sudo[65519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkuigdoblohmvtwzfexiphdgtwjapijb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396839.2028787-1039-153685522202173/AnsiballZ_command.py'
Nov 29 06:13:59 compute-1 sudo[65519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:13:59 compute-1 python3.9[65521]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:13:59 compute-1 sudo[65519]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:00 compute-1 sudo[65672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuvlwjsuqmqisxujiyyvjypprrwvteaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396840.1231658-1063-210239103673167/AnsiballZ_command.py'
Nov 29 06:14:00 compute-1 sudo[65672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:00 compute-1 python3.9[65674]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:14:00 compute-1 sudo[65672]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:01 compute-1 sudo[65825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thzenajklkaxfgkkonjvzzzmcbzlwhcf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764396841.0195966-1087-168744466232391/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:14:01 compute-1 sudo[65825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:01 compute-1 python3[65827]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:14:01 compute-1 sudo[65825]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:02 compute-1 sudo[65977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnvoljdmoozhejgspjigubylpnwwxfnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396841.9217458-1111-149388908292550/AnsiballZ_stat.py'
Nov 29 06:14:02 compute-1 sudo[65977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:02 compute-1 python3.9[65979]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:14:02 compute-1 sudo[65977]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:03 compute-1 sudo[66100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlcxpqiphomuxjfzhddoscufqvazbeui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396841.9217458-1111-149388908292550/AnsiballZ_copy.py'
Nov 29 06:14:03 compute-1 sudo[66100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:03 compute-1 python3.9[66102]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396841.9217458-1111-149388908292550/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:03 compute-1 sudo[66100]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:03 compute-1 sudo[66252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddoyginonzxzxhlqstuhysdtdgyzuceg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396843.5359545-1156-265707800178324/AnsiballZ_stat.py'
Nov 29 06:14:03 compute-1 sudo[66252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:04 compute-1 python3.9[66254]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:14:04 compute-1 sudo[66252]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:04 compute-1 sudo[66375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwgsjghxammwmdpyqdfriurepvsdgqbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396843.5359545-1156-265707800178324/AnsiballZ_copy.py'
Nov 29 06:14:04 compute-1 sudo[66375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:04 compute-1 python3.9[66377]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396843.5359545-1156-265707800178324/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:04 compute-1 sudo[66375]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:05 compute-1 sudo[66527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgapmoyejorfgqymymmrmphaqmksxbil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396845.142922-1201-184815662986162/AnsiballZ_stat.py'
Nov 29 06:14:05 compute-1 sudo[66527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:05 compute-1 python3.9[66529]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:14:05 compute-1 sudo[66527]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:06 compute-1 sudo[66650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sparqpmrijhmrvfakrqznypmlcibilfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396845.142922-1201-184815662986162/AnsiballZ_copy.py'
Nov 29 06:14:06 compute-1 sudo[66650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:06 compute-1 python3.9[66652]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396845.142922-1201-184815662986162/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:06 compute-1 sudo[66650]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:07 compute-1 sudo[66802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enaaiihrgpqmzkdzkxjurjidvrjmzbtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396846.664679-1246-247228150418312/AnsiballZ_stat.py'
Nov 29 06:14:07 compute-1 sudo[66802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:07 compute-1 python3.9[66804]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:14:07 compute-1 sudo[66802]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:07 compute-1 sudo[66925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtsridfqzniechjgiggmsojoufmdhgpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396846.664679-1246-247228150418312/AnsiballZ_copy.py'
Nov 29 06:14:07 compute-1 sudo[66925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:07 compute-1 python3.9[66927]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396846.664679-1246-247228150418312/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:07 compute-1 sudo[66925]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:08 compute-1 sudo[67077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfexaafphrcmdtxetyhujgjaiogdhdly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396848.2022111-1291-160402387134504/AnsiballZ_stat.py'
Nov 29 06:14:08 compute-1 sudo[67077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:08 compute-1 python3.9[67079]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:14:08 compute-1 sudo[67077]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:09 compute-1 sudo[67200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lriygpcmmfzpkcfvpauxperwmzgiurtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396848.2022111-1291-160402387134504/AnsiballZ_copy.py'
Nov 29 06:14:09 compute-1 sudo[67200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:09 compute-1 python3.9[67202]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396848.2022111-1291-160402387134504/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:09 compute-1 sudo[67200]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:10 compute-1 sudo[67352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxhfuvqpycgkyhuzscfkvmrzubcifdqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396849.7761815-1336-194649538732462/AnsiballZ_file.py'
Nov 29 06:14:10 compute-1 sudo[67352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:10 compute-1 python3.9[67354]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:10 compute-1 sudo[67352]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:11 compute-1 sudo[67504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlaemqhxhxitekqjniniajrucnhhmtka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396850.6644094-1360-116797964012299/AnsiballZ_command.py'
Nov 29 06:14:11 compute-1 sudo[67504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:11 compute-1 python3.9[67506]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:14:11 compute-1 sudo[67504]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:12 compute-1 sudo[67663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzsmccikdcwslvjkjpzszllcrufxmshr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396851.636633-1384-134050194697313/AnsiballZ_blockinfile.py'
Nov 29 06:14:12 compute-1 sudo[67663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:12 compute-1 python3.9[67665]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:12 compute-1 sudo[67663]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:13 compute-1 sudo[67816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjhmahrhdpkzzwyuttpnroozwlhcpojy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396852.8573902-1411-65783758750735/AnsiballZ_file.py'
Nov 29 06:14:13 compute-1 sudo[67816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:13 compute-1 python3.9[67818]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:13 compute-1 sudo[67816]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:14 compute-1 sudo[67968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbepwbcxydmrdlekhagniyfdinspmfdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396853.88541-1411-112391235978986/AnsiballZ_file.py'
Nov 29 06:14:14 compute-1 sudo[67968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:14 compute-1 python3.9[67970]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:14 compute-1 sudo[67968]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:15 compute-1 sudo[68120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txedkdnlvnpyjsxpfrmhldakxmsltmwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396854.802786-1456-86470528127881/AnsiballZ_mount.py'
Nov 29 06:14:15 compute-1 sudo[68120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:15 compute-1 python3.9[68122]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 06:14:15 compute-1 sudo[68120]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:16 compute-1 sudo[68273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnlinhcumxhtsegguoqismwwnyezicou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396855.7453325-1456-10792384573605/AnsiballZ_mount.py'
Nov 29 06:14:16 compute-1 sudo[68273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:16 compute-1 python3.9[68275]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 06:14:16 compute-1 sudo[68273]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:16 compute-1 sshd-session[59071]: Connection closed by 192.168.122.30 port 43192
Nov 29 06:14:16 compute-1 sshd-session[59068]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:14:16 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Nov 29 06:14:16 compute-1 systemd[1]: session-15.scope: Consumed 43.077s CPU time.
Nov 29 06:14:16 compute-1 systemd-logind[785]: Session 15 logged out. Waiting for processes to exit.
Nov 29 06:14:16 compute-1 systemd-logind[785]: Removed session 15.
Nov 29 06:14:23 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 06:14:26 compute-1 sshd-session[68304]: Accepted publickey for zuul from 192.168.122.30 port 51726 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:14:26 compute-1 systemd-logind[785]: New session 16 of user zuul.
Nov 29 06:14:26 compute-1 systemd[1]: Started Session 16 of User zuul.
Nov 29 06:14:26 compute-1 sshd-session[68304]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:14:27 compute-1 sudo[68457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwpkfwsfqajzgziswndtzdkjftmrnmvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396866.681163-24-105313207986541/AnsiballZ_tempfile.py'
Nov 29 06:14:27 compute-1 sudo[68457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:27 compute-1 python3.9[68459]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 29 06:14:27 compute-1 sudo[68457]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:28 compute-1 sshd-session[68536]: Received disconnect from 45.55.249.98 port 58176:11: Bye Bye [preauth]
Nov 29 06:14:28 compute-1 sshd-session[68536]: Disconnected from authenticating user root 45.55.249.98 port 58176 [preauth]
Nov 29 06:14:28 compute-1 sudo[68611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avlyhdzmgerdwpqxpwiigfdrlkkrhion ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396867.6326113-60-177857779602464/AnsiballZ_stat.py'
Nov 29 06:14:28 compute-1 sudo[68611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:28 compute-1 python3.9[68613]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:14:28 compute-1 sudo[68611]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:29 compute-1 sudo[68763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snmthzzfcnbttcqnkqrklselzvtjcdjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396868.695148-90-53464588821604/AnsiballZ_setup.py'
Nov 29 06:14:29 compute-1 sudo[68763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:29 compute-1 python3.9[68765]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:14:29 compute-1 sudo[68763]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:30 compute-1 sudo[68915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxrhusgyntjalehqijltwqswefeuvrax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396869.9784584-115-95100458493551/AnsiballZ_blockinfile.py'
Nov 29 06:14:30 compute-1 sudo[68915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:30 compute-1 python3.9[68917]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCX0dhB1m0xL0qEi5jnTQLLB4bvueVV5foNrqU/OkfV/4gRyp7uP2q21lWq5Dtl2GLk51pS6oD41RI41Y5g7OSRs8b1Z66d6X1QgX0Qns6pv7FwmNSQ25+2VGV6lppnaN5e+JHiwTmzpf82hl/MiiJrHo7B63mllKyl9SZJxUhP9RR4czS3QNYQsZyP7sZeCWothTZ2Q/GK4BWBEtj2+ifeOpa342IivopCH05YVQOx9bpsdFHMYaalMDCwvr2lfVns8aTcpJ3z9uE8wLdKWTyiinT7nuLX6RuPwhXB2proBRH1wrGSIUgcVcizkWn8QizD8LlsGFcHIQJkmq+sJz6r7cCZLIfS6hdAzI+hYbJie6n/agwfxe4r+mbXsmmC6ALKKk7CEnaiNnDg0fgTaUfBPwSfu+JmVrjdSO+S8f/CMbtYeO6QknOxhLV9oK6knszv7nLlSYXTzXanHkN4Y0fW3dsSvoE+qDR0YijbbT8slqMd6z95wWVDFUmTcN8Nzk8=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILci1PI4hoB56+xxS5gSMKceuJ/dv6t7etpmtENwoSFr
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJIaOLr2ntjSUcigXC7a0sFoonsuh0ChCx2a1R6G8EDmJ8/ZB8NEiJE6KAQJDNU5XsXjuaC44eJhOUMRK9r98xA=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2GXKCQiCwQEMihcSwDVeJtG2CpTemmA6MTbtOkxbB3OAV5PK8v8imPvDGMDurfGFQG0RzWyv9szlMJXdgIkwejIfy/AY7p6nemHOpu6DdAx0EA/jg1YcOIeeEhyMw1/oFzjYClGMohaI1oTKHtR29UXWphTAroOkf26Exvco6hh2ApRTXV9ObzSoOyCC7+OZcOWgYzdoCfu/0FDGkH2ksKLQS7d4AAh/XZ/njXhK57U7ptxHCReUPECGRv7KB4f8TelZDAIeUyp7ngd/9ivUDO1zue1Qr9ECzTzAFqippGXFmYl3+oSid03CY7bqnxav4xWt7UukbaO57goyIPfkklPdC1kA7kZqa9bqeDU1WgDkqnLu8hluArB0Y0Jz+hDfx9pTbAL6MklraoLaGrnrgcibAollAN+7WGqdWxUotENYaljO7P1Z18MlNllWFzk4Le5jMLNL8qArSlzM+ufOThnLdGEuYZhH1x969AisGQ4MQWn0P0lZFu6fE5VSNA/k=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDdPWx5WoFJTxz6PiFZL5f3XrtE682RjGFiIpoe0LXZO
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQlZMweHfLYiJFtm1r2tQze/oNx6KzgaXkK+Kof7POk0cFMLbTsXU8qgbQMh4o5LVO0Hbas4mAqxRkGcFCg2Po=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUVpPatup3d17omeiTdJaYR8jCcDbraJSPBxWy49Wxst4G+6/lD41HVIKmjgCgIbbmYSFBPQmoXt4gFXP4FRKna6AbQWi0kwF3/T2biQ2qCid0HVDSS8YRVlyrpdVc1/bIg6YNLkGnhzOMp0S1443+cg5PqutAbrAT1LOg6lSBu+K9gIqJ4un3l2guSweoyba5UhMyjrq4Pffx1QCuBggtYSjmA9Q1r5VVNc2J7AbP0QuzOe6J6DhpdGJsfmHDVXZb/4b/aPUdCTKkLseyUtcqElWVhhnGnpYSJdN81ejalSktGHE4JRHih19wwTokiKvoczUgijBzOfl+kt2ELcpDgzpzY0M9yd0Zz7wrK4rLM6hi8x3LYZXZv8N7KnawUcJ2jfzilx1BVLdNzgwDNB7ZlP4O9Vs3fKnBufCUFPNcRyWl6ooczepbgxqgSbr/Ham2O4/qzvJmzLtu0KxBkaFALRWnyM39nYVE/jrMKJ5ihtVDxIY9FGma/Jifg15gqI0=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN19pK3a7AH/OiwlqJTVWP/qzU/QzkC16s4D1xY1Vn6J
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLsXsjJNPVMX1YVTe2oBmcZpUSiv3HOeuICgZtQun4hTopMXH9dE1jQeUruGwqZ+NsKW6X2bLZZJ0/tcn2owL8Q=
                                             create=True mode=0644 path=/tmp/ansible.oiwd6zah state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:30 compute-1 sudo[68915]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:31 compute-1 sudo[69067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfaytfeqwfiymyiaegfnereymcdmfquv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396870.9493945-139-113610343168990/AnsiballZ_command.py'
Nov 29 06:14:31 compute-1 sudo[69067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:31 compute-1 python3.9[69069]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.oiwd6zah' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:14:31 compute-1 sudo[69067]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:32 compute-1 sudo[69221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bllovibkfhwskfaudatpsezgzihkicnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396871.8595557-163-221559175113718/AnsiballZ_file.py'
Nov 29 06:14:32 compute-1 sudo[69221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:32 compute-1 python3.9[69223]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.oiwd6zah state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:32 compute-1 sudo[69221]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:32 compute-1 sshd-session[68307]: Connection closed by 192.168.122.30 port 51726
Nov 29 06:14:32 compute-1 sshd-session[68304]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:14:32 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Nov 29 06:14:33 compute-1 systemd[1]: session-16.scope: Consumed 4.297s CPU time.
Nov 29 06:14:33 compute-1 systemd-logind[785]: Session 16 logged out. Waiting for processes to exit.
Nov 29 06:14:33 compute-1 systemd-logind[785]: Removed session 16.
Nov 29 06:14:35 compute-1 sshd-session[69248]: Invalid user prueba from 118.194.230.250 port 49744
Nov 29 06:14:35 compute-1 sshd-session[69248]: Received disconnect from 118.194.230.250 port 49744:11: Bye Bye [preauth]
Nov 29 06:14:35 compute-1 sshd-session[69248]: Disconnected from invalid user prueba 118.194.230.250 port 49744 [preauth]
Nov 29 06:14:39 compute-1 sshd-session[69250]: Accepted publickey for zuul from 192.168.122.30 port 46962 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:14:39 compute-1 systemd-logind[785]: New session 17 of user zuul.
Nov 29 06:14:39 compute-1 systemd[1]: Started Session 17 of User zuul.
Nov 29 06:14:39 compute-1 sshd-session[69250]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:14:40 compute-1 python3.9[69403]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:14:41 compute-1 sudo[69557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvkgjpcqikrryoqrgyrwgivznsqvmzsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396880.6042156-62-170438075286525/AnsiballZ_systemd.py'
Nov 29 06:14:41 compute-1 sudo[69557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:41 compute-1 python3.9[69559]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 06:14:41 compute-1 sudo[69557]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:42 compute-1 sudo[69711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sewhambfkdzmomhvnehhbdbcoyvehzle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396881.823234-86-59451285013844/AnsiballZ_systemd.py'
Nov 29 06:14:42 compute-1 sudo[69711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:42 compute-1 python3.9[69713]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:14:43 compute-1 sudo[69711]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:44 compute-1 sudo[69864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrrsmdffhjvaveohkcyzgoqnecdblxus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396883.8909078-114-192603184450543/AnsiballZ_command.py'
Nov 29 06:14:44 compute-1 sudo[69864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:44 compute-1 python3.9[69866]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:14:44 compute-1 sudo[69864]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:45 compute-1 sudo[70017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikdzcntopyomvkvtvsdowarscvwlcbqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396884.8103733-137-25840010727556/AnsiballZ_stat.py'
Nov 29 06:14:45 compute-1 sudo[70017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:45 compute-1 python3.9[70019]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:14:45 compute-1 sudo[70017]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:46 compute-1 sudo[70171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dizttqxjqkfwjmcovlrtiwvlauwgvyka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396885.7774978-161-84016967200203/AnsiballZ_command.py'
Nov 29 06:14:46 compute-1 sudo[70171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:46 compute-1 python3.9[70173]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:14:46 compute-1 sudo[70171]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:47 compute-1 sudo[70326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqusgbnraxiyqtomzvayvlwpyxiymjhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396886.6037037-185-216317432839091/AnsiballZ_file.py'
Nov 29 06:14:47 compute-1 sudo[70326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:47 compute-1 python3.9[70328]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:14:47 compute-1 sudo[70326]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:47 compute-1 sshd-session[69253]: Connection closed by 192.168.122.30 port 46962
Nov 29 06:14:47 compute-1 sshd-session[69250]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:14:47 compute-1 systemd-logind[785]: Session 17 logged out. Waiting for processes to exit.
Nov 29 06:14:47 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Nov 29 06:14:47 compute-1 systemd[1]: session-17.scope: Consumed 5.478s CPU time.
Nov 29 06:14:47 compute-1 systemd-logind[785]: Removed session 17.
Nov 29 06:14:53 compute-1 sshd-session[70353]: Accepted publickey for zuul from 192.168.122.30 port 44022 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:14:53 compute-1 systemd-logind[785]: New session 18 of user zuul.
Nov 29 06:14:53 compute-1 systemd[1]: Started Session 18 of User zuul.
Nov 29 06:14:53 compute-1 sshd-session[70353]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:14:54 compute-1 python3.9[70506]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:14:55 compute-1 sudo[70660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wghofaikhzmzqlsnedfhzrneooxvmenj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396894.8703973-68-268952561376681/AnsiballZ_setup.py'
Nov 29 06:14:55 compute-1 sudo[70660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:55 compute-1 python3.9[70662]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:14:55 compute-1 sudo[70660]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:56 compute-1 sudo[70744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywncspmqlongjainibmlgepthdjnkvwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764396894.8703973-68-268952561376681/AnsiballZ_dnf.py'
Nov 29 06:14:56 compute-1 sudo[70744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:14:56 compute-1 python3.9[70746]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:14:57 compute-1 sudo[70744]: pam_unix(sudo:session): session closed for user root
Nov 29 06:14:58 compute-1 python3.9[70897]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:15:00 compute-1 python3.9[71048]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:15:00 compute-1 python3.9[71198]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:15:00 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:15:00 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:15:01 compute-1 python3.9[71349]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:15:02 compute-1 sshd-session[70356]: Connection closed by 192.168.122.30 port 44022
Nov 29 06:15:02 compute-1 sshd-session[70353]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:15:02 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Nov 29 06:15:02 compute-1 systemd[1]: session-18.scope: Consumed 6.848s CPU time.
Nov 29 06:15:02 compute-1 systemd-logind[785]: Session 18 logged out. Waiting for processes to exit.
Nov 29 06:15:02 compute-1 systemd-logind[785]: Removed session 18.
Nov 29 06:15:09 compute-1 chronyd[58583]: Selected source 142.4.192.253 (pool.ntp.org)
Nov 29 06:15:11 compute-1 sshd-session[71375]: Accepted publickey for zuul from 38.102.83.107 port 46652 ssh2: RSA SHA256:MGJJb6X2bjkH8oWT85dgz2a/TwKBbh3/GDOWF3tnPlY
Nov 29 06:15:11 compute-1 systemd-logind[785]: New session 19 of user zuul.
Nov 29 06:15:11 compute-1 systemd[1]: Started Session 19 of User zuul.
Nov 29 06:15:11 compute-1 sshd-session[71375]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:15:11 compute-1 sudo[71451]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlweoakxyduftlshsuvhhkfqxbmhzzml ; /usr/bin/python3'
Nov 29 06:15:11 compute-1 sudo[71451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:12 compute-1 useradd[71455]: new group: name=ceph-admin, GID=42478
Nov 29 06:15:12 compute-1 useradd[71455]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Nov 29 06:15:12 compute-1 sudo[71451]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:12 compute-1 sudo[71537]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfahmsgdpsaimulfknplawcipwrmouzc ; /usr/bin/python3'
Nov 29 06:15:12 compute-1 sudo[71537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:12 compute-1 sudo[71537]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:13 compute-1 sudo[71610]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqvhxfqnexlkwablvpalpmvkfpzmehvy ; /usr/bin/python3'
Nov 29 06:15:13 compute-1 sudo[71610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:13 compute-1 sudo[71610]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:13 compute-1 sudo[71660]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qazbrfyefvzethsqurxmkgpozauvnjuq ; /usr/bin/python3'
Nov 29 06:15:13 compute-1 sudo[71660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:13 compute-1 sudo[71660]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:14 compute-1 sudo[71686]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmqjvziejmlsvgsodnesqprwuykmiass ; /usr/bin/python3'
Nov 29 06:15:14 compute-1 sudo[71686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:14 compute-1 sudo[71686]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:14 compute-1 sudo[71712]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psxjtiegnlphxitnqsxhzlhhwyzdhivo ; /usr/bin/python3'
Nov 29 06:15:14 compute-1 sudo[71712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:14 compute-1 sudo[71712]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:15 compute-1 sudo[71738]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbsrfmsrgtjgsnmnnjtrpdaupbqxwjul ; /usr/bin/python3'
Nov 29 06:15:15 compute-1 sudo[71738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:15 compute-1 sudo[71738]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:15 compute-1 sudo[71816]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykgxgctdqtgovmhuzrxsiukupymmrajm ; /usr/bin/python3'
Nov 29 06:15:15 compute-1 sudo[71816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:15 compute-1 sudo[71816]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:16 compute-1 sudo[71889]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uutsokbuiyzorzyviywyxurlbhdlhntv ; /usr/bin/python3'
Nov 29 06:15:16 compute-1 sudo[71889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:16 compute-1 sudo[71889]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:16 compute-1 sudo[71991]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjzjbmyvmuspyrmdqxahkucfydalrhlc ; /usr/bin/python3'
Nov 29 06:15:16 compute-1 sudo[71991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:16 compute-1 sudo[71991]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:17 compute-1 sudo[72064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpmjiutlvgaabhdcepwzbiqseyqetfgs ; /usr/bin/python3'
Nov 29 06:15:17 compute-1 sudo[72064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:17 compute-1 sudo[72064]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:17 compute-1 sudo[72114]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbekbfekskhecohwofkrhpmlazmlqsrq ; /usr/bin/python3'
Nov 29 06:15:17 compute-1 sudo[72114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:18 compute-1 python3[72116]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:15:19 compute-1 sudo[72114]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:19 compute-1 sudo[72209]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdzosklrdyrwgmrzrrahrkimexbmvumr ; /usr/bin/python3'
Nov 29 06:15:19 compute-1 sudo[72209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:20 compute-1 python3[72211]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 29 06:15:21 compute-1 sudo[72209]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:21 compute-1 sudo[72236]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwxrljdaaynvwndzouljwsmywdpacycj ; /usr/bin/python3'
Nov 29 06:15:21 compute-1 sudo[72236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:21 compute-1 python3[72238]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 29 06:15:21 compute-1 sudo[72236]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:22 compute-1 sudo[72262]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxiwbbbpvjrxavzsrnjttfhkyzxfrrig ; /usr/bin/python3'
Nov 29 06:15:22 compute-1 sudo[72262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:22 compute-1 python3[72264]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:15:22 compute-1 kernel: loop: module loaded
Nov 29 06:15:22 compute-1 kernel: loop3: detected capacity change from 0 to 14680064
Nov 29 06:15:22 compute-1 sudo[72262]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:22 compute-1 sudo[72297]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ponvobaqszponoxqojbjbxndtykqmove ; /usr/bin/python3'
Nov 29 06:15:22 compute-1 sudo[72297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:22 compute-1 python3[72299]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:15:22 compute-1 lvm[72302]: PV /dev/loop3 not used.
Nov 29 06:15:22 compute-1 lvm[72304]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 06:15:22 compute-1 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Nov 29 06:15:23 compute-1 lvm[72314]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 06:15:23 compute-1 lvm[72314]: VG ceph_vg0 finished
Nov 29 06:15:23 compute-1 lvm[72312]:   1 logical volume(s) in volume group "ceph_vg0" now active
Nov 29 06:15:23 compute-1 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Nov 29 06:15:23 compute-1 sudo[72297]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:24 compute-1 sudo[72390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbpzqlxfxptwpczpdyusqvggygfwrvmd ; /usr/bin/python3'
Nov 29 06:15:24 compute-1 sudo[72390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:24 compute-1 python3[72392]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 06:15:24 compute-1 sudo[72390]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:24 compute-1 sudo[72463]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ditfoahvshvwhjglfypdxtoclrxugxwt ; /usr/bin/python3'
Nov 29 06:15:24 compute-1 sudo[72463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:24 compute-1 python3[72465]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764396923.9252074-37029-138445971144281/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:15:24 compute-1 sudo[72463]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:25 compute-1 sudo[72513]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkumdxqcfnoconevziydoraktxgxxhqz ; /usr/bin/python3'
Nov 29 06:15:25 compute-1 sudo[72513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:15:25 compute-1 python3[72515]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:15:25 compute-1 systemd[1]: Reloading.
Nov 29 06:15:25 compute-1 systemd-sysv-generator[72545]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:15:25 compute-1 systemd-rc-local-generator[72536]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:15:26 compute-1 systemd[1]: Starting Ceph OSD losetup...
Nov 29 06:15:26 compute-1 bash[72556]: /dev/loop3: [64513]:4194937 (/var/lib/ceph-osd-0.img)
Nov 29 06:15:26 compute-1 systemd[1]: Finished Ceph OSD losetup.
Nov 29 06:15:26 compute-1 sudo[72513]: pam_unix(sudo:session): session closed for user root
Nov 29 06:15:26 compute-1 lvm[72560]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 06:15:26 compute-1 lvm[72560]: VG ceph_vg0 finished
Nov 29 06:15:27 compute-1 sshd-session[72561]: Invalid user ftpuser from 71.70.164.48 port 51430
Nov 29 06:15:28 compute-1 sshd-session[72561]: Received disconnect from 71.70.164.48 port 51430:11: Bye Bye [preauth]
Nov 29 06:15:28 compute-1 sshd-session[72561]: Disconnected from invalid user ftpuser 71.70.164.48 port 51430 [preauth]
Nov 29 06:15:28 compute-1 python3[72586]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:15:47 compute-1 sshd-session[72630]: Invalid user ubuntu from 45.55.249.98 port 43018
Nov 29 06:15:47 compute-1 sshd-session[72630]: Received disconnect from 45.55.249.98 port 43018:11: Bye Bye [preauth]
Nov 29 06:15:47 compute-1 sshd-session[72630]: Disconnected from invalid user ubuntu 45.55.249.98 port 43018 [preauth]
Nov 29 06:15:51 compute-1 sshd-session[72632]: Received disconnect from 118.194.230.250 port 49846:11: Bye Bye [preauth]
Nov 29 06:15:51 compute-1 sshd-session[72632]: Disconnected from authenticating user root 118.194.230.250 port 49846 [preauth]
Nov 29 06:17:03 compute-1 sshd-session[72634]: Invalid user userb from 45.55.249.98 port 51352
Nov 29 06:17:03 compute-1 sshd-session[72634]: Received disconnect from 45.55.249.98 port 51352:11: Bye Bye [preauth]
Nov 29 06:17:03 compute-1 sshd-session[72634]: Disconnected from invalid user userb 45.55.249.98 port 51352 [preauth]
Nov 29 06:17:11 compute-1 sshd-session[72636]: Received disconnect from 118.194.230.250 port 49948:11: Bye Bye [preauth]
Nov 29 06:17:11 compute-1 sshd-session[72636]: Disconnected from authenticating user root 118.194.230.250 port 49948 [preauth]
Nov 29 06:17:29 compute-1 sshd-session[72638]: Accepted publickey for ceph-admin from 192.168.122.100 port 47886 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:29 compute-1 systemd-logind[785]: New session 20 of user ceph-admin.
Nov 29 06:17:29 compute-1 systemd[1]: Created slice User Slice of UID 42477.
Nov 29 06:17:29 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42477...
Nov 29 06:17:29 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42477.
Nov 29 06:17:30 compute-1 systemd[1]: Starting User Manager for UID 42477...
Nov 29 06:17:30 compute-1 systemd[72642]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:30 compute-1 sshd-session[72648]: Accepted publickey for ceph-admin from 192.168.122.100 port 47894 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:30 compute-1 systemd-logind[785]: New session 22 of user ceph-admin.
Nov 29 06:17:30 compute-1 systemd[72642]: Queued start job for default target Main User Target.
Nov 29 06:17:30 compute-1 systemd[72642]: Created slice User Application Slice.
Nov 29 06:17:30 compute-1 systemd[72642]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 06:17:30 compute-1 systemd[72642]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 06:17:30 compute-1 systemd[72642]: Reached target Paths.
Nov 29 06:17:30 compute-1 systemd[72642]: Reached target Timers.
Nov 29 06:17:30 compute-1 systemd[72642]: Starting D-Bus User Message Bus Socket...
Nov 29 06:17:30 compute-1 systemd[72642]: Starting Create User's Volatile Files and Directories...
Nov 29 06:17:30 compute-1 systemd[72642]: Listening on D-Bus User Message Bus Socket.
Nov 29 06:17:30 compute-1 systemd[72642]: Reached target Sockets.
Nov 29 06:17:30 compute-1 systemd[72642]: Finished Create User's Volatile Files and Directories.
Nov 29 06:17:30 compute-1 systemd[72642]: Reached target Basic System.
Nov 29 06:17:30 compute-1 systemd[72642]: Reached target Main User Target.
Nov 29 06:17:30 compute-1 systemd[72642]: Startup finished in 171ms.
Nov 29 06:17:30 compute-1 systemd[1]: Started User Manager for UID 42477.
Nov 29 06:17:30 compute-1 systemd[1]: Started Session 20 of User ceph-admin.
Nov 29 06:17:30 compute-1 systemd[1]: Started Session 22 of User ceph-admin.
Nov 29 06:17:30 compute-1 sshd-session[72638]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:30 compute-1 sshd-session[72648]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:30 compute-1 sudo[72663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:30 compute-1 sudo[72663]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:30 compute-1 sudo[72663]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:30 compute-1 sudo[72688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:17:30 compute-1 sudo[72688]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:30 compute-1 sudo[72688]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:30 compute-1 sshd-session[72713]: Accepted publickey for ceph-admin from 192.168.122.100 port 47906 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:30 compute-1 systemd-logind[785]: New session 23 of user ceph-admin.
Nov 29 06:17:30 compute-1 systemd[1]: Started Session 23 of User ceph-admin.
Nov 29 06:17:30 compute-1 sshd-session[72713]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:30 compute-1 sudo[72717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:30 compute-1 sudo[72717]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:30 compute-1 sudo[72717]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:30 compute-1 sudo[72742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host --expect-hostname compute-1
Nov 29 06:17:30 compute-1 sudo[72742]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:30 compute-1 sudo[72742]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:31 compute-1 sshd-session[72767]: Accepted publickey for ceph-admin from 192.168.122.100 port 47908 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:31 compute-1 systemd-logind[785]: New session 24 of user ceph-admin.
Nov 29 06:17:31 compute-1 systemd[1]: Started Session 24 of User ceph-admin.
Nov 29 06:17:31 compute-1 sshd-session[72767]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:31 compute-1 sudo[72771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:31 compute-1 sudo[72771]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:31 compute-1 sudo[72771]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:31 compute-1 sudo[72796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d
Nov 29 06:17:31 compute-1 sudo[72796]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:31 compute-1 sudo[72796]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:31 compute-1 sshd-session[72821]: Accepted publickey for ceph-admin from 192.168.122.100 port 47922 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:31 compute-1 systemd-logind[785]: New session 25 of user ceph-admin.
Nov 29 06:17:31 compute-1 systemd[1]: Started Session 25 of User ceph-admin.
Nov 29 06:17:31 compute-1 sshd-session[72821]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:31 compute-1 sudo[72825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:31 compute-1 sudo[72825]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:31 compute-1 sudo[72825]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:31 compute-1 sudo[72850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:17:31 compute-1 sudo[72850]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:31 compute-1 sudo[72850]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:31 compute-1 sshd-session[72875]: Accepted publickey for ceph-admin from 192.168.122.100 port 47938 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:31 compute-1 systemd-logind[785]: New session 26 of user ceph-admin.
Nov 29 06:17:31 compute-1 systemd[1]: Started Session 26 of User ceph-admin.
Nov 29 06:17:32 compute-1 sshd-session[72875]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:32 compute-1 sudo[72879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:32 compute-1 sudo[72879]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:32 compute-1 sudo[72879]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:32 compute-1 sudo[72904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:17:32 compute-1 sudo[72904]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:32 compute-1 sudo[72904]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:32 compute-1 sshd-session[72929]: Accepted publickey for ceph-admin from 192.168.122.100 port 47940 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:32 compute-1 systemd-logind[785]: New session 27 of user ceph-admin.
Nov 29 06:17:32 compute-1 systemd[1]: Started Session 27 of User ceph-admin.
Nov 29 06:17:32 compute-1 sshd-session[72929]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:32 compute-1 sudo[72933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:32 compute-1 sudo[72933]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:32 compute-1 sudo[72933]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:32 compute-1 sudo[72958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d.new
Nov 29 06:17:32 compute-1 sudo[72958]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:32 compute-1 sudo[72958]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:32 compute-1 sshd-session[72983]: Accepted publickey for ceph-admin from 192.168.122.100 port 44116 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:32 compute-1 systemd-logind[785]: New session 28 of user ceph-admin.
Nov 29 06:17:32 compute-1 systemd[1]: Started Session 28 of User ceph-admin.
Nov 29 06:17:32 compute-1 sshd-session[72983]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:32 compute-1 sudo[72987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:32 compute-1 sudo[72987]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:32 compute-1 sudo[72987]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:33 compute-1 sudo[73012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:17:33 compute-1 sudo[73012]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:33 compute-1 sudo[73012]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:33 compute-1 sshd-session[73037]: Accepted publickey for ceph-admin from 192.168.122.100 port 44130 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:33 compute-1 systemd-logind[785]: New session 29 of user ceph-admin.
Nov 29 06:17:33 compute-1 systemd[1]: Started Session 29 of User ceph-admin.
Nov 29 06:17:33 compute-1 sshd-session[73037]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:33 compute-1 sudo[73041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:33 compute-1 sudo[73041]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:33 compute-1 sudo[73041]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:33 compute-1 sudo[73066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d.new
Nov 29 06:17:33 compute-1 sudo[73066]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:33 compute-1 sudo[73066]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:33 compute-1 sshd-session[73091]: Accepted publickey for ceph-admin from 192.168.122.100 port 44132 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:33 compute-1 systemd-logind[785]: New session 30 of user ceph-admin.
Nov 29 06:17:33 compute-1 systemd[1]: Started Session 30 of User ceph-admin.
Nov 29 06:17:33 compute-1 sshd-session[73091]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:34 compute-1 sshd-session[73118]: Accepted publickey for ceph-admin from 192.168.122.100 port 44140 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:34 compute-1 systemd-logind[785]: New session 31 of user ceph-admin.
Nov 29 06:17:34 compute-1 systemd[1]: Started Session 31 of User ceph-admin.
Nov 29 06:17:34 compute-1 sshd-session[73118]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:34 compute-1 sudo[73122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:34 compute-1 sudo[73122]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:34 compute-1 sudo[73122]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:34 compute-1 sudo[73147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d.new /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d
Nov 29 06:17:34 compute-1 sudo[73147]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:34 compute-1 sudo[73147]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:34 compute-1 sshd-session[73172]: Accepted publickey for ceph-admin from 192.168.122.100 port 44156 ssh2: RSA SHA256:wSO38gUigzg+3qmbq5ZCXhMSnm1ow+14BbAXfOugcIA
Nov 29 06:17:34 compute-1 systemd-logind[785]: New session 32 of user ceph-admin.
Nov 29 06:17:34 compute-1 systemd[1]: Started Session 32 of User ceph-admin.
Nov 29 06:17:34 compute-1 sshd-session[73172]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Nov 29 06:17:34 compute-1 sudo[73176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:34 compute-1 sudo[73176]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:34 compute-1 sudo[73176]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:34 compute-1 sudo[73201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host --expect-hostname compute-1
Nov 29 06:17:34 compute-1 sudo[73201]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:35 compute-1 sudo[73201]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:35 compute-1 sudo[73247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:35 compute-1 sudo[73247]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:35 compute-1 sudo[73247]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:35 compute-1 sudo[73272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:17:35 compute-1 sudo[73272]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:35 compute-1 sudo[73272]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:35 compute-1 sudo[73297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:35 compute-1 sudo[73297]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:35 compute-1 sudo[73297]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:35 compute-1 sudo[73322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Nov 29 06:17:35 compute-1 sudo[73322]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:36 compute-1 sudo[73322]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-1 sudo[73368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:36 compute-1 sudo[73368]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-1 sudo[73368]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-1 sudo[73393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:17:36 compute-1 sudo[73393]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-1 sudo[73393]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-1 sudo[73418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:36 compute-1 sudo[73418]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-1 sudo[73418]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-1 sudo[73443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:17:36 compute-1 sudo[73443]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:36 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:36 compute-1 sudo[73443]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-1 sudo[73506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:36 compute-1 sudo[73506]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-1 sudo[73506]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-1 sudo[73531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:17:36 compute-1 sudo[73531]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-1 sudo[73531]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:36 compute-1 sudo[73556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:36 compute-1 sudo[73556]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:36 compute-1 sudo[73556]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:37 compute-1 sudo[73581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:17:37 compute-1 sudo[73581]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:37 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73618 (sysctl)
Nov 29 06:17:37 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:37 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 29 06:17:37 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 29 06:17:37 compute-1 sudo[73581]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:37 compute-1 sudo[73640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:37 compute-1 sudo[73640]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:37 compute-1 sudo[73640]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-1 sudo[73665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:17:38 compute-1 sudo[73665]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-1 sudo[73665]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-1 sudo[73690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:38 compute-1 sudo[73690]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-1 sudo[73690]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-1 sudo[73715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 list-networks
Nov 29 06:17:38 compute-1 sudo[73715]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:38 compute-1 sudo[73715]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-1 sudo[73759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:38 compute-1 sudo[73759]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-1 sudo[73759]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-1 sudo[73784]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:17:38 compute-1 sudo[73784]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-1 sudo[73784]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-1 sudo[73809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:38 compute-1 sudo[73809]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-1 sudo[73809]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-1 sudo[73834]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- inventory --format=json-pretty --filter-for-batch
Nov 29 06:17:38 compute-1 sudo[73834]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:38 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:38 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:46 compute-1 podman[73895]: 2025-11-29 06:17:46.990807249 +0000 UTC m=+8.036427383 image pull-error  quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 unable to copy from source docker://quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0: copying system image from manifest list: reading blob sha256:5d6e359445031266a061adaf2d66bc7e110161eb2d4cc1c20df0b7b391e2e65a: Get "https://cdn01.quay.io/quayio-production-s3/sha256/5d/5d6e359445031266a061adaf2d66bc7e110161eb2d4cc1c20df0b7b391e2e65a?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251129%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251129T061739Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=f2c8d0740ca5e417432f59f87b3e1a078374b58c1ece456a50051e5535d22395&region=us-east-1&namespace=ceph&repo_name=ceph&akamai_signature=exp=1764397959~hmac=489e02c9402b7ae0da17693faba5808124f30e4a58d26632cd42db255b5697df": remote error: tls: internal error
Nov 29 06:17:46 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:47 compute-1 sudo[73834]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:47 compute-1 sudo[73914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:47 compute-1 sudo[73914]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:47 compute-1 sudo[73914]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:47 compute-1 sudo[73939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 29 06:17:47 compute-1 sudo[73939]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:47 compute-1 sudo[73939]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:47 compute-1 sudo[73964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:47 compute-1 sudo[73964]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:47 compute-1 sudo[73964]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:47 compute-1 sudo[73989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph
Nov 29 06:17:47 compute-1 sudo[73989]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:47 compute-1 sudo[73989]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:47 compute-1 sudo[74014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:47 compute-1 sudo[74014]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:47 compute-1 sudo[74014]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:47 compute-1 sudo[74039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:17:47 compute-1 sudo[74039]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:47 compute-1 sudo[74039]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:47 compute-1 sudo[74064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:47 compute-1 sudo[74064]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:47 compute-1 sudo[74064]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:47 compute-1 sudo[74089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:17:47 compute-1 sudo[74089]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:47 compute-1 sudo[74089]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:47 compute-1 sudo[74114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:47 compute-1 sudo[74114]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:47 compute-1 sudo[74114]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:47 compute-1 sudo[74139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:17:47 compute-1 sudo[74139]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:47 compute-1 sudo[74139]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:47 compute-1 sudo[74187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:47 compute-1 sudo[74187]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:47 compute-1 sudo[74187]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:17:48 compute-1 sudo[74212]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74212]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:48 compute-1 sudo[74237]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74237]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:17:48 compute-1 sudo[74262]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74262]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:48 compute-1 sudo[74287]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74287]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 29 06:17:48 compute-1 sudo[74312]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74312]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:48 compute-1 sudo[74337]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74337]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:17:48 compute-1 sudo[74362]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74362]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:48 compute-1 sudo[74387]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74387]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:17:48 compute-1 sudo[74412]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74412]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:48 compute-1 sudo[74437]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74437]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:17:48 compute-1 sudo[74462]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74462]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:48 compute-1 sudo[74487]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74487]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:48 compute-1 sudo[74512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:17:48 compute-1 sudo[74512]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:48 compute-1 sudo[74512]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:49 compute-1 sudo[74537]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74537]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:17:49 compute-1 sudo[74562]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74562]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:49 compute-1 sudo[74610]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74610]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:17:49 compute-1 sudo[74635]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74635]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:49 compute-1 sudo[74660]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74660]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:17:49 compute-1 sudo[74685]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74685]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:49 compute-1 sudo[74710]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74710]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:17:49 compute-1 sudo[74735]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74735]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:49 compute-1 sudo[74760]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74760]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 29 06:17:49 compute-1 sudo[74785]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74785]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:49 compute-1 sudo[74810]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74810]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph
Nov 29 06:17:49 compute-1 sudo[74835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74835]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:49 compute-1 sudo[74860]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74860]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:49 compute-1 sudo[74885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.client.admin.keyring.new
Nov 29 06:17:49 compute-1 sudo[74885]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:49 compute-1 sudo[74885]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[74910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:50 compute-1 sudo[74910]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[74910]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[74935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:17:50 compute-1 sudo[74935]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[74935]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[74960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:50 compute-1 sudo[74960]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[74960]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[74985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.client.admin.keyring.new
Nov 29 06:17:50 compute-1 sudo[74985]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[74985]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[75033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:50 compute-1 sudo[75033]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[75033]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[75058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.client.admin.keyring.new
Nov 29 06:17:50 compute-1 sudo[75058]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[75058]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[75083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:50 compute-1 sudo[75083]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[75083]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[75108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.client.admin.keyring.new
Nov 29 06:17:50 compute-1 sudo[75108]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[75108]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[75133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:50 compute-1 sudo[75133]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[75133]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[75158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 29 06:17:50 compute-1 sudo[75158]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[75158]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[75183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:50 compute-1 sudo[75183]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[75183]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[75208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:17:50 compute-1 sudo[75208]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[75208]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:50 compute-1 sudo[75233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:50 compute-1 sudo[75233]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:50 compute-1 sudo[75233]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:17:51 compute-1 sudo[75258]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75258]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:51 compute-1 sudo[75283]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75283]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring.new
Nov 29 06:17:51 compute-1 sudo[75308]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75308]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:51 compute-1 sudo[75333]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75333]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:17:51 compute-1 sudo[75358]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75358]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:51 compute-1 sudo[75383]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75383]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring.new
Nov 29 06:17:51 compute-1 sudo[75408]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75408]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:51 compute-1 sudo[75456]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75456]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring.new
Nov 29 06:17:51 compute-1 sudo[75481]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75481]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:51 compute-1 sudo[75506]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75506]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring.new
Nov 29 06:17:51 compute-1 sudo[75531]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75531]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:51 compute-1 sudo[75556]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75556]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring.new /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring
Nov 29 06:17:51 compute-1 sudo[75581]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75581]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-1 sudo[75606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:51 compute-1 sudo[75606]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:51 compute-1 sudo[75606]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:52 compute-1 sudo[75631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:17:52 compute-1 sudo[75631]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:52 compute-1 sudo[75631]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:52 compute-1 sudo[75656]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:17:52 compute-1 sudo[75656]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:52 compute-1 sudo[75656]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:52 compute-1 sudo[75681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:17:52 compute-1 sudo[75681]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:17:52 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:52 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:17:54 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat3728586434-merged.mount: Deactivated successfully.
Nov 29 06:17:54 compute-1 sshd-session[75771]: Invalid user user1 from 71.70.164.48 port 51421
Nov 29 06:17:54 compute-1 sshd-session[75771]: Received disconnect from 71.70.164.48 port 51421:11: Bye Bye [preauth]
Nov 29 06:17:54 compute-1 sshd-session[75771]: Disconnected from invalid user user1 71.70.164.48 port 51421 [preauth]
Nov 29 06:17:54 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat3728586434-lower\x2dmapped.mount: Deactivated successfully.
Nov 29 06:18:16 compute-1 podman[75746]: 2025-11-29 06:18:16.408385917 +0000 UTC m=+23.966234202 container create f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Nov 29 06:18:16 compute-1 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck3820584154-merged.mount: Deactivated successfully.
Nov 29 06:18:16 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 29 06:18:16 compute-1 systemd[1]: Started libpod-conmon-f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96.scope.
Nov 29 06:18:16 compute-1 podman[75746]: 2025-11-29 06:18:16.388224989 +0000 UTC m=+23.946073294 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:16 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:16 compute-1 podman[75746]: 2025-11-29 06:18:16.526671639 +0000 UTC m=+24.084519984 container init f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Nov 29 06:18:16 compute-1 podman[75746]: 2025-11-29 06:18:16.538702721 +0000 UTC m=+24.096551046 container start f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:18:16 compute-1 podman[75746]: 2025-11-29 06:18:16.543640118 +0000 UTC m=+24.101488443 container attach f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:18:16 compute-1 reverent_keller[75815]: 167 167
Nov 29 06:18:16 compute-1 systemd[1]: libpod-f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96.scope: Deactivated successfully.
Nov 29 06:18:16 compute-1 podman[75746]: 2025-11-29 06:18:16.549063918 +0000 UTC m=+24.106912223 container died f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:18:16 compute-1 systemd[1]: var-lib-containers-storage-overlay-f5cad89f6770416eba1342b8ee95aa48dbbe36647ee73ef4b65e74454b2515db-merged.mount: Deactivated successfully.
Nov 29 06:18:16 compute-1 podman[75746]: 2025-11-29 06:18:16.600769508 +0000 UTC m=+24.158617803 container remove f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:18:16 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:18:16 compute-1 systemd[1]: libpod-conmon-f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96.scope: Deactivated successfully.
Nov 29 06:18:16 compute-1 systemd[1]: Reloading.
Nov 29 06:18:16 compute-1 systemd-sysv-generator[75866]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:18:16 compute-1 systemd-rc-local-generator[75861]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:18:16 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:18:16 compute-1 systemd[1]: Reloading.
Nov 29 06:18:17 compute-1 systemd-rc-local-generator[75898]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:18:17 compute-1 systemd-sysv-generator[75901]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:18:17 compute-1 systemd[1]: Reached target All Ceph clusters and services.
Nov 29 06:18:17 compute-1 systemd[1]: Reloading.
Nov 29 06:18:17 compute-1 systemd-sysv-generator[75942]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:18:17 compute-1 systemd-rc-local-generator[75937]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:18:17 compute-1 systemd[1]: Reached target Ceph cluster 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:18:17 compute-1 systemd[1]: Reloading.
Nov 29 06:18:17 compute-1 systemd-sysv-generator[75981]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:18:17 compute-1 systemd-rc-local-generator[75976]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:18:17 compute-1 systemd[1]: Reloading.
Nov 29 06:18:17 compute-1 systemd-rc-local-generator[76018]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:18:17 compute-1 systemd-sysv-generator[76021]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:18:18 compute-1 systemd[1]: Created slice Slice /system/ceph-336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:18:18 compute-1 systemd[1]: Reached target System Time Set.
Nov 29 06:18:18 compute-1 systemd[1]: Reached target System Time Synchronized.
Nov 29 06:18:18 compute-1 systemd[1]: Starting Ceph crash.compute-1 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:18:18 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:18:18 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:18:18 compute-1 podman[76076]: 2025-11-29 06:18:18.389064124 +0000 UTC m=+0.057101141 container create 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Nov 29 06:18:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f781cdad64ba65b454ab70e9b45e6bf8cfd68f347c8f32c578688a8e4d7b89b7/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f781cdad64ba65b454ab70e9b45e6bf8cfd68f347c8f32c578688a8e4d7b89b7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f781cdad64ba65b454ab70e9b45e6bf8cfd68f347c8f32c578688a8e4d7b89b7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:18 compute-1 podman[76076]: 2025-11-29 06:18:18.448740364 +0000 UTC m=+0.116777401 container init 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 06:18:18 compute-1 podman[76076]: 2025-11-29 06:18:18.453187868 +0000 UTC m=+0.121224855 container start 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:18:18 compute-1 bash[76076]: 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6
Nov 29 06:18:18 compute-1 podman[76076]: 2025-11-29 06:18:18.369091641 +0000 UTC m=+0.037128628 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:18 compute-1 systemd[1]: Started Ceph crash.compute-1 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:18:18 compute-1 sudo[75681]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:18 compute-1 sudo[76096]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:18 compute-1 sudo[76096]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:18 compute-1 sudo[76096]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:18 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: INFO:ceph-crash:pinging cluster to exercise our key
Nov 29 06:18:18 compute-1 sudo[76121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:18 compute-1 sudo[76121]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:18 compute-1 sudo[76121]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:18 compute-1 sudo[76148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:18 compute-1 sudo[76148]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:18 compute-1 sudo[76148]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:18 compute-1 sudo[76173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Nov 29 06:18:18 compute-1 sudo[76173]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:18 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.851+0000 7fd226a7b640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 29 06:18:18 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.851+0000 7fd226a7b640 -1 AuthRegistry(0x7fd220066fe0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 29 06:18:18 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.853+0000 7fd226a7b640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 29 06:18:18 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.853+0000 7fd226a7b640 -1 AuthRegistry(0x7fd226a7a000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 29 06:18:18 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.856+0000 7fd21ffff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 29 06:18:18 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.856+0000 7fd226a7b640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Nov 29 06:18:18 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: [errno 13] RADOS permission denied (error connecting to the cluster)
Nov 29 06:18:18 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Nov 29 06:18:19 compute-1 podman[76247]: 2025-11-29 06:18:19.117693068 +0000 UTC m=+0.055146186 container create eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:18:19 compute-1 systemd[1]: Started libpod-conmon-eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd.scope.
Nov 29 06:18:19 compute-1 podman[76247]: 2025-11-29 06:18:19.086899716 +0000 UTC m=+0.024352884 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:19 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:19 compute-1 podman[76247]: 2025-11-29 06:18:19.217589601 +0000 UTC m=+0.155042719 container init eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:18:19 compute-1 podman[76247]: 2025-11-29 06:18:19.231745753 +0000 UTC m=+0.169198871 container start eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 29 06:18:19 compute-1 podman[76247]: 2025-11-29 06:18:19.236317129 +0000 UTC m=+0.173770217 container attach eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:18:19 compute-1 keen_cori[76263]: 167 167
Nov 29 06:18:19 compute-1 systemd[1]: libpod-eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd.scope: Deactivated successfully.
Nov 29 06:18:19 compute-1 conmon[76263]: conmon eecf89c619bd173bf00c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd.scope/container/memory.events
Nov 29 06:18:19 compute-1 podman[76247]: 2025-11-29 06:18:19.239883467 +0000 UTC m=+0.177336585 container died eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:18:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-1c0c7a7c932ef8ca8f43a9e68cec7183d38a0b38892b0d000e58a72a4d9ad42d-merged.mount: Deactivated successfully.
Nov 29 06:18:19 compute-1 podman[76247]: 2025-11-29 06:18:19.294463997 +0000 UTC m=+0.231917115 container remove eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:18:19 compute-1 systemd[1]: libpod-conmon-eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd.scope: Deactivated successfully.
Nov 29 06:18:19 compute-1 podman[76288]: 2025-11-29 06:18:19.541726917 +0000 UTC m=+0.079134691 container create 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:18:19 compute-1 systemd[1]: Started libpod-conmon-4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2.scope.
Nov 29 06:18:19 compute-1 podman[76288]: 2025-11-29 06:18:19.506676367 +0000 UTC m=+0.044084181 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:19 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:19 compute-1 podman[76288]: 2025-11-29 06:18:19.653484318 +0000 UTC m=+0.190892122 container init 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Nov 29 06:18:19 compute-1 podman[76288]: 2025-11-29 06:18:19.668669589 +0000 UTC m=+0.206077363 container start 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:18:19 compute-1 podman[76288]: 2025-11-29 06:18:19.674691865 +0000 UTC m=+0.212099629 container attach 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 29 06:18:20 compute-1 stupefied_noether[76304]: --> passed data devices: 0 physical, 1 LVM
Nov 29 06:18:20 compute-1 stupefied_noether[76304]: --> relative data size: 1.0
Nov 29 06:18:20 compute-1 stupefied_noether[76304]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 29 06:18:20 compute-1 stupefied_noether[76304]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new f793b967-de22-4105-bb0d-c91464bf150f
Nov 29 06:18:21 compute-1 stupefied_noether[76304]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 29 06:18:21 compute-1 stupefied_noether[76304]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Nov 29 06:18:21 compute-1 stupefied_noether[76304]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Nov 29 06:18:21 compute-1 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 06:18:21 compute-1 stupefied_noether[76304]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:21 compute-1 lvm[76352]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 06:18:21 compute-1 lvm[76352]: VG ceph_vg0 finished
Nov 29 06:18:21 compute-1 stupefied_noether[76304]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Nov 29 06:18:21 compute-1 stupefied_noether[76304]:  stderr: got monmap epoch 1
Nov 29 06:18:21 compute-1 stupefied_noether[76304]: --> Creating keyring file for osd.0
Nov 29 06:18:21 compute-1 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Nov 29 06:18:21 compute-1 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Nov 29 06:18:21 compute-1 stupefied_noether[76304]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid f793b967-de22-4105-bb0d-c91464bf150f --setuser ceph --setgroup ceph
Nov 29 06:18:23 compute-1 sshd-session[76426]: Received disconnect from 45.55.249.98 port 34482:11: Bye Bye [preauth]
Nov 29 06:18:23 compute-1 sshd-session[76426]: Disconnected from authenticating user root 45.55.249.98 port 34482 [preauth]
Nov 29 06:18:24 compute-1 stupefied_noether[76304]:  stderr: 2025-11-29T06:18:21.826+0000 7fb6d587f740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 06:18:24 compute-1 stupefied_noether[76304]:  stderr: 2025-11-29T06:18:21.827+0000 7fb6d587f740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 06:18:24 compute-1 stupefied_noether[76304]:  stderr: 2025-11-29T06:18:21.827+0000 7fb6d587f740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 06:18:24 compute-1 stupefied_noether[76304]:  stderr: 2025-11-29T06:18:21.827+0000 7fb6d587f740 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Nov 29 06:18:24 compute-1 stupefied_noether[76304]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Nov 29 06:18:25 compute-1 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 06:18:25 compute-1 stupefied_noether[76304]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Nov 29 06:18:25 compute-1 stupefied_noether[76304]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:25 compute-1 stupefied_noether[76304]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:25 compute-1 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 06:18:25 compute-1 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 06:18:25 compute-1 stupefied_noether[76304]: --> ceph-volume lvm activate successful for osd ID: 0
Nov 29 06:18:25 compute-1 stupefied_noether[76304]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Nov 29 06:18:25 compute-1 systemd[1]: libpod-4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2.scope: Deactivated successfully.
Nov 29 06:18:25 compute-1 systemd[1]: libpod-4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2.scope: Consumed 2.543s CPU time.
Nov 29 06:18:25 compute-1 podman[76288]: 2025-11-29 06:18:25.17266064 +0000 UTC m=+5.710068424 container died 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:18:25 compute-1 systemd[1]: var-lib-containers-storage-overlay-6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260-merged.mount: Deactivated successfully.
Nov 29 06:18:26 compute-1 podman[76288]: 2025-11-29 06:18:26.013482478 +0000 UTC m=+6.550890252 container remove 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:18:26 compute-1 systemd[1]: libpod-conmon-4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2.scope: Deactivated successfully.
Nov 29 06:18:26 compute-1 sudo[76173]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:26 compute-1 sudo[77285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:26 compute-1 sudo[77285]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:26 compute-1 sudo[77285]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:26 compute-1 sudo[77310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:26 compute-1 sudo[77310]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:26 compute-1 sudo[77310]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:26 compute-1 sudo[77335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:26 compute-1 sudo[77335]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:26 compute-1 sudo[77335]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:26 compute-1 sudo[77360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- lvm list --format json
Nov 29 06:18:26 compute-1 sudo[77360]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:26 compute-1 podman[77424]: 2025-11-29 06:18:26.743262048 +0000 UTC m=+0.026669833 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:26 compute-1 podman[77424]: 2025-11-29 06:18:26.884457687 +0000 UTC m=+0.167865472 container create 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Nov 29 06:18:26 compute-1 systemd[1]: Started libpod-conmon-730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874.scope.
Nov 29 06:18:26 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:27 compute-1 podman[77424]: 2025-11-29 06:18:27.005128489 +0000 UTC m=+0.288536284 container init 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507)
Nov 29 06:18:27 compute-1 podman[77424]: 2025-11-29 06:18:27.018608139 +0000 UTC m=+0.302015924 container start 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Nov 29 06:18:27 compute-1 recursing_herschel[77441]: 167 167
Nov 29 06:18:27 compute-1 systemd[1]: libpod-730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874.scope: Deactivated successfully.
Nov 29 06:18:27 compute-1 podman[77424]: 2025-11-29 06:18:27.098653638 +0000 UTC m=+0.382061463 container attach 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Nov 29 06:18:27 compute-1 podman[77424]: 2025-11-29 06:18:27.099444046 +0000 UTC m=+0.382851831 container died 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:18:27 compute-1 systemd[1]: var-lib-containers-storage-overlay-021639303c70fff0e21c2d8bfb8c23dd309bdd38bbb3c9f264dbcbee9cf71e30-merged.mount: Deactivated successfully.
Nov 29 06:18:27 compute-1 podman[77424]: 2025-11-29 06:18:27.510664496 +0000 UTC m=+0.794072271 container remove 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Nov 29 06:18:27 compute-1 systemd[1]: libpod-conmon-730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874.scope: Deactivated successfully.
Nov 29 06:18:27 compute-1 podman[77468]: 2025-11-29 06:18:27.773673362 +0000 UTC m=+0.092352134 container create 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:18:27 compute-1 podman[77468]: 2025-11-29 06:18:27.709200589 +0000 UTC m=+0.027879381 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:27 compute-1 systemd[1]: Started libpod-conmon-140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1.scope.
Nov 29 06:18:27 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c99024dbff69317d78cf0765d176e8f92aa5e050e65b73caace64ceee49dd7a8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c99024dbff69317d78cf0765d176e8f92aa5e050e65b73caace64ceee49dd7a8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c99024dbff69317d78cf0765d176e8f92aa5e050e65b73caace64ceee49dd7a8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c99024dbff69317d78cf0765d176e8f92aa5e050e65b73caace64ceee49dd7a8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:27 compute-1 podman[77468]: 2025-11-29 06:18:27.935380966 +0000 UTC m=+0.254059798 container init 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Nov 29 06:18:27 compute-1 podman[77468]: 2025-11-29 06:18:27.949232684 +0000 UTC m=+0.267911456 container start 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Nov 29 06:18:27 compute-1 podman[77468]: 2025-11-29 06:18:27.953209703 +0000 UTC m=+0.271888465 container attach 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Nov 29 06:18:28 compute-1 trusting_bassi[77484]: {
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:     "0": [
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:         {
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             "devices": [
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "/dev/loop3"
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             ],
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             "lv_name": "ceph_lv0",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             "lv_size": "7511998464",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=336ec58c-893b-528f-a0c1-6ed1196bc047,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=f793b967-de22-4105-bb0d-c91464bf150f,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             "lv_uuid": "caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             "name": "ceph_lv0",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             "path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             "tags": {
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.block_uuid": "caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.cephx_lockbox_secret": "",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.cluster_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.cluster_name": "ceph",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.crush_device_class": "",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.encrypted": "0",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.osd_fsid": "f793b967-de22-4105-bb0d-c91464bf150f",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.osd_id": "0",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.osdspec_affinity": "default_drive_group",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.type": "block",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:                 "ceph.vdo": "0"
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             },
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             "type": "block",
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:             "vg_name": "ceph_vg0"
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:         }
Nov 29 06:18:28 compute-1 trusting_bassi[77484]:     ]
Nov 29 06:18:28 compute-1 trusting_bassi[77484]: }
Nov 29 06:18:28 compute-1 systemd[1]: libpod-140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1.scope: Deactivated successfully.
Nov 29 06:18:28 compute-1 podman[77468]: 2025-11-29 06:18:28.781945642 +0000 UTC m=+1.100624384 container died 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Nov 29 06:18:29 compute-1 systemd[1]: var-lib-containers-storage-overlay-c99024dbff69317d78cf0765d176e8f92aa5e050e65b73caace64ceee49dd7a8-merged.mount: Deactivated successfully.
Nov 29 06:18:29 compute-1 podman[77468]: 2025-11-29 06:18:29.259051257 +0000 UTC m=+1.577729999 container remove 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Nov 29 06:18:29 compute-1 sudo[77360]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:29 compute-1 systemd[1]: libpod-conmon-140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1.scope: Deactivated successfully.
Nov 29 06:18:29 compute-1 sudo[77505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:29 compute-1 sudo[77505]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:29 compute-1 sudo[77505]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:29 compute-1 sudo[77530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:29 compute-1 sudo[77530]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:29 compute-1 sudo[77530]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:29 compute-1 sudo[77555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:29 compute-1 sudo[77555]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:29 compute-1 sudo[77555]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:29 compute-1 sudo[77580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:18:29 compute-1 sudo[77580]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:30 compute-1 podman[77646]: 2025-11-29 06:18:30.098540136 +0000 UTC m=+0.032022523 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:30 compute-1 podman[77646]: 2025-11-29 06:18:30.329569421 +0000 UTC m=+0.263051748 container create 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Nov 29 06:18:30 compute-1 systemd[1]: Started libpod-conmon-00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15.scope.
Nov 29 06:18:30 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:30 compute-1 podman[77646]: 2025-11-29 06:18:30.506948074 +0000 UTC m=+0.440430371 container init 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:18:30 compute-1 podman[77646]: 2025-11-29 06:18:30.515248918 +0000 UTC m=+0.448731205 container start 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Nov 29 06:18:30 compute-1 boring_leakey[77663]: 167 167
Nov 29 06:18:30 compute-1 systemd[1]: libpod-00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15.scope: Deactivated successfully.
Nov 29 06:18:30 compute-1 podman[77646]: 2025-11-29 06:18:30.565568046 +0000 UTC m=+0.499050363 container attach 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:18:30 compute-1 podman[77646]: 2025-11-29 06:18:30.566113419 +0000 UTC m=+0.499595736 container died 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:18:30 compute-1 systemd[1]: var-lib-containers-storage-overlay-39a1f0a4234a2a43258903052215cc9e4841dbfd9c0e17576e3af253c4269edf-merged.mount: Deactivated successfully.
Nov 29 06:18:30 compute-1 podman[77646]: 2025-11-29 06:18:30.615376584 +0000 UTC m=+0.548858891 container remove 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:18:30 compute-1 systemd[1]: libpod-conmon-00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15.scope: Deactivated successfully.
Nov 29 06:18:30 compute-1 sshd-session[77632]: Invalid user jose from 118.194.230.250 port 50048
Nov 29 06:18:30 compute-1 podman[77696]: 2025-11-29 06:18:30.964851972 +0000 UTC m=+0.052864737 container create 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 29 06:18:31 compute-1 systemd[1]: Started libpod-conmon-89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237.scope.
Nov 29 06:18:31 compute-1 podman[77696]: 2025-11-29 06:18:30.945050691 +0000 UTC m=+0.033063466 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:31 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:31 compute-1 sshd-session[77632]: Received disconnect from 118.194.230.250 port 50048:11: Bye Bye [preauth]
Nov 29 06:18:31 compute-1 sshd-session[77632]: Disconnected from invalid user jose 118.194.230.250 port 50048 [preauth]
Nov 29 06:18:31 compute-1 podman[77696]: 2025-11-29 06:18:31.090032584 +0000 UTC m=+0.178045369 container init 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Nov 29 06:18:31 compute-1 podman[77696]: 2025-11-29 06:18:31.106386218 +0000 UTC m=+0.194398993 container start 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:18:31 compute-1 podman[77696]: 2025-11-29 06:18:31.110982009 +0000 UTC m=+0.198994774 container attach 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:18:31 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test[77713]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Nov 29 06:18:31 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test[77713]:                             [--no-systemd] [--no-tmpfs]
Nov 29 06:18:31 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test[77713]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 29 06:18:31 compute-1 systemd[1]: libpod-89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237.scope: Deactivated successfully.
Nov 29 06:18:31 compute-1 podman[77696]: 2025-11-29 06:18:31.808104904 +0000 UTC m=+0.896117669 container died 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS)
Nov 29 06:18:31 compute-1 systemd[1]: var-lib-containers-storage-overlay-6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520-merged.mount: Deactivated successfully.
Nov 29 06:18:31 compute-1 podman[77696]: 2025-11-29 06:18:31.876810381 +0000 UTC m=+0.964823126 container remove 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:18:31 compute-1 systemd[1]: libpod-conmon-89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237.scope: Deactivated successfully.
Nov 29 06:18:32 compute-1 systemd[1]: Reloading.
Nov 29 06:18:32 compute-1 systemd-sysv-generator[77779]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:18:32 compute-1 systemd-rc-local-generator[77775]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:18:32 compute-1 systemd[1]: Reloading.
Nov 29 06:18:32 compute-1 systemd-rc-local-generator[77814]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:18:32 compute-1 systemd-sysv-generator[77819]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:18:32 compute-1 systemd[1]: Starting Ceph osd.0 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:18:33 compute-1 podman[77873]: 2025-11-29 06:18:33.056007701 +0000 UTC m=+0.061149881 container create 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:18:33 compute-1 podman[77873]: 2025-11-29 06:18:33.030492994 +0000 UTC m=+0.035635184 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:33 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:33 compute-1 podman[77873]: 2025-11-29 06:18:33.152815963 +0000 UTC m=+0.157958183 container init 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:18:33 compute-1 podman[77873]: 2025-11-29 06:18:33.166427895 +0000 UTC m=+0.171570035 container start 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 06:18:33 compute-1 podman[77873]: 2025-11-29 06:18:33.170900815 +0000 UTC m=+0.176042985 container attach 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Nov 29 06:18:34 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 06:18:34 compute-1 bash[77873]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 06:18:34 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 06:18:34 compute-1 bash[77873]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 06:18:34 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 06:18:34 compute-1 bash[77873]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 06:18:34 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 06:18:34 compute-1 bash[77873]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 06:18:34 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:34 compute-1 bash[77873]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:34 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 06:18:34 compute-1 bash[77873]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 06:18:34 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: --> ceph-volume raw activate successful for osd ID: 0
Nov 29 06:18:34 compute-1 bash[77873]: --> ceph-volume raw activate successful for osd ID: 0
Nov 29 06:18:34 compute-1 systemd[1]: libpod-736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a.scope: Deactivated successfully.
Nov 29 06:18:34 compute-1 systemd[1]: libpod-736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a.scope: Consumed 1.103s CPU time.
Nov 29 06:18:34 compute-1 podman[78008]: 2025-11-29 06:18:34.297774501 +0000 UTC m=+0.029999257 container died 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Nov 29 06:18:34 compute-1 systemd[1]: var-lib-containers-storage-overlay-7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d-merged.mount: Deactivated successfully.
Nov 29 06:18:34 compute-1 podman[78008]: 2025-11-29 06:18:34.367642485 +0000 UTC m=+0.099867141 container remove 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Nov 29 06:18:34 compute-1 podman[78069]: 2025-11-29 06:18:34.602058104 +0000 UTC m=+0.047562398 container create 142ead126c9af06537b8a836b8bfc0c1c7aa78776cf87cddc2c00f33cd39daba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Nov 29 06:18:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a1ab998af2835240f03fd4bff908c647865d353b867edf3d8f61e9f5621b998/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a1ab998af2835240f03fd4bff908c647865d353b867edf3d8f61e9f5621b998/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a1ab998af2835240f03fd4bff908c647865d353b867edf3d8f61e9f5621b998/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a1ab998af2835240f03fd4bff908c647865d353b867edf3d8f61e9f5621b998/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a1ab998af2835240f03fd4bff908c647865d353b867edf3d8f61e9f5621b998/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:34 compute-1 podman[78069]: 2025-11-29 06:18:34.582731225 +0000 UTC m=+0.028235519 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:34 compute-1 podman[78069]: 2025-11-29 06:18:34.697731742 +0000 UTC m=+0.143236036 container init 142ead126c9af06537b8a836b8bfc0c1c7aa78776cf87cddc2c00f33cd39daba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Nov 29 06:18:34 compute-1 podman[78069]: 2025-11-29 06:18:34.714417012 +0000 UTC m=+0.159921286 container start 142ead126c9af06537b8a836b8bfc0c1c7aa78776cf87cddc2c00f33cd39daba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:18:34 compute-1 bash[78069]: 142ead126c9af06537b8a836b8bfc0c1c7aa78776cf87cddc2c00f33cd39daba
Nov 29 06:18:34 compute-1 systemd[1]: Started Ceph osd.0 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:18:34 compute-1 ceph-osd[78089]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:18:34 compute-1 ceph-osd[78089]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Nov 29 06:18:34 compute-1 ceph-osd[78089]: pidfile_write: ignore empty --pid-file
Nov 29 06:18:34 compute-1 ceph-osd[78089]: bdev(0x5566c6f69800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:34 compute-1 sudo[77580]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:34 compute-1 ceph-osd[78089]: bdev(0x5566c6f69800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 06:18:34 compute-1 ceph-osd[78089]: bdev(0x5566c6f69800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:18:34 compute-1 ceph-osd[78089]: bdev(0x5566c6f69800 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:18:34 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 06:18:34 compute-1 ceph-osd[78089]: bdev(0x5566c7da1800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:34 compute-1 ceph-osd[78089]: bdev(0x5566c7da1800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 06:18:34 compute-1 ceph-osd[78089]: bdev(0x5566c7da1800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:18:34 compute-1 ceph-osd[78089]: bdev(0x5566c7da1800 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:18:34 compute-1 ceph-osd[78089]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Nov 29 06:18:34 compute-1 ceph-osd[78089]: bdev(0x5566c7da1800 /var/lib/ceph/osd/ceph-0/block) close
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6f69800 /var/lib/ceph/osd/ceph-0/block) close
Nov 29 06:18:35 compute-1 sudo[78104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:35 compute-1 sudo[78104]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:35 compute-1 sudo[78104]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:35 compute-1 sudo[78129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:35 compute-1 sudo[78129]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:35 compute-1 sudo[78129]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:35 compute-1 ceph-osd[78089]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Nov 29 06:18:35 compute-1 ceph-osd[78089]: load: jerasure load: lrc 
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) close
Nov 29 06:18:35 compute-1 sudo[78154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:35 compute-1 sudo[78154]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:35 compute-1 sudo[78154]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:35 compute-1 sudo[78184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- raw list --format json
Nov 29 06:18:35 compute-1 sudo[78184]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) close
Nov 29 06:18:35 compute-1 ceph-osd[78089]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 29 06:18:35 compute-1 ceph-osd[78089]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluefs mount
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluefs mount shared_bdev_used = 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: RocksDB version: 7.9.2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Git sha 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: DB SUMMARY
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: DB Session ID:  YT76S9WB35YQ4FZZK94N
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: CURRENT file:  CURRENT
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                         Options.error_if_exists: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.create_if_missing: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                                     Options.env: 0x5566c7df3d50
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                                Options.info_log: 0x5566c6ff4ba0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                              Options.statistics: (nil)
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.use_fsync: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                              Options.db_log_dir: 
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                                 Options.wal_dir: db.wal
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.write_buffer_manager: 0x5566c7f04460
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.unordered_write: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.row_cache: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                              Options.wal_filter: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.two_write_queues: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.wal_compression: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.atomic_flush: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.max_background_jobs: 4
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.max_background_compactions: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.max_subcompactions: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.max_open_files: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Compression algorithms supported:
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         kZSTD supported: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         kXpressCompression supported: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         kBZip2Compression supported: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         kLZ4Compression supported: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         kZlibCompression supported: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         kLZ4HCCompression supported: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         kSnappyCompression supported: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc430
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc430
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc430
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc430
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 podman[78252]: 2025-11-29 06:18:35.883948907 +0000 UTC m=+0.125330437 container create 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc430
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc430
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:35 compute-1 podman[78252]: 2025-11-29 06:18:35.796641576 +0000 UTC m=+0.038023146 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc430
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5200)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc2d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5200)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc2d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5200)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc2d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ba850913-8330-42fb-9d78-1800ad716abe
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397115892448, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397115892720, "job": 1, "event": "recovery_finished"}
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Nov 29 06:18:35 compute-1 ceph-osd[78089]: freelist init
Nov 29 06:18:35 compute-1 ceph-osd[78089]: freelist _read_cfg
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 29 06:18:35 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bluefs umount
Nov 29 06:18:35 compute-1 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) close
Nov 29 06:18:35 compute-1 systemd[1]: Started libpod-conmon-7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47.scope.
Nov 29 06:18:35 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:35 compute-1 podman[78252]: 2025-11-29 06:18:35.996888427 +0000 UTC m=+0.238269977 container init 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:18:36 compute-1 podman[78252]: 2025-11-29 06:18:36.007214367 +0000 UTC m=+0.248595907 container start 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:18:36 compute-1 podman[78252]: 2025-11-29 06:18:36.012649548 +0000 UTC m=+0.254031168 container attach 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:18:36 compute-1 sharp_morse[78462]: 167 167
Nov 29 06:18:36 compute-1 systemd[1]: libpod-7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47.scope: Deactivated successfully.
Nov 29 06:18:36 compute-1 podman[78252]: 2025-11-29 06:18:36.01547748 +0000 UTC m=+0.256859050 container died 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Nov 29 06:18:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-4d8e2971b3b19a69b047dac4446fb5bfe70bddbfe1e367571c85c15542a21be2-merged.mount: Deactivated successfully.
Nov 29 06:18:36 compute-1 podman[78252]: 2025-11-29 06:18:36.061849291 +0000 UTC m=+0.303230821 container remove 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:18:36 compute-1 systemd[1]: libpod-conmon-7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47.scope: Deactivated successfully.
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bluefs mount
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bluefs mount shared_bdev_used = 4718592
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: RocksDB version: 7.9.2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Git sha 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: DB SUMMARY
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: DB Session ID:  YT76S9WB35YQ4FZZK94M
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: CURRENT file:  CURRENT
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                         Options.error_if_exists: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.create_if_missing: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                                     Options.env: 0x5566c7fd0460
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                                Options.info_log: 0x5566c6ff4980
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                              Options.statistics: (nil)
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.use_fsync: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                              Options.db_log_dir: 
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                                 Options.wal_dir: db.wal
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.write_buffer_manager: 0x5566c7f04460
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.unordered_write: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.row_cache: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                              Options.wal_filter: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.two_write_queues: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.wal_compression: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.atomic_flush: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.max_background_jobs: 4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.max_background_compactions: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.max_subcompactions: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.max_open_files: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Compression algorithms supported:
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         kZSTD supported: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         kXpressCompression supported: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         kBZip2Compression supported: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         kLZ4Compression supported: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         kZlibCompression supported: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         kLZ4HCCompression supported: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         kSnappyCompression supported: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc2d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc2d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc2d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc2d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc2d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc2d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdc2d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072720)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdd350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072720)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdd350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072720)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5566c6fdd350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ba850913-8330-42fb-9d78-1800ad716abe
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397116160664, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397116165347, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397116, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ba850913-8330-42fb-9d78-1800ad716abe", "db_session_id": "YT76S9WB35YQ4FZZK94M", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397116168047, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1593, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 467, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397116, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ba850913-8330-42fb-9d78-1800ad716abe", "db_session_id": "YT76S9WB35YQ4FZZK94M", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397116171775, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397116, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ba850913-8330-42fb-9d78-1800ad716abe", "db_session_id": "YT76S9WB35YQ4FZZK94M", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397116173098, "job": 1, "event": "recovery_finished"}
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5566c70afc00
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: DB pointer 0x5566c701fa00
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Nov 29 06:18:36 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:18:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Nov 29 06:18:36 compute-1 ceph-osd[78089]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Nov 29 06:18:36 compute-1 ceph-osd[78089]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Nov 29 06:18:36 compute-1 ceph-osd[78089]: _get_class not permitted to load lua
Nov 29 06:18:36 compute-1 ceph-osd[78089]: _get_class not permitted to load sdk
Nov 29 06:18:36 compute-1 ceph-osd[78089]: _get_class not permitted to load test_remote_reads
Nov 29 06:18:36 compute-1 ceph-osd[78089]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Nov 29 06:18:36 compute-1 ceph-osd[78089]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Nov 29 06:18:36 compute-1 ceph-osd[78089]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Nov 29 06:18:36 compute-1 ceph-osd[78089]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Nov 29 06:18:36 compute-1 ceph-osd[78089]: osd.0 0 load_pgs
Nov 29 06:18:36 compute-1 ceph-osd[78089]: osd.0 0 load_pgs opened 0 pgs
Nov 29 06:18:36 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0[78085]: 2025-11-29T06:18:36.205+0000 7ff69099e740 -1 osd.0 0 log_to_monitors true
Nov 29 06:18:36 compute-1 ceph-osd[78089]: osd.0 0 log_to_monitors true
Nov 29 06:18:36 compute-1 podman[78669]: 2025-11-29 06:18:36.261419716 +0000 UTC m=+0.051503795 container create 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 06:18:36 compute-1 systemd[1]: Started libpod-conmon-7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b.scope.
Nov 29 06:18:36 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:36 compute-1 podman[78669]: 2025-11-29 06:18:36.238916696 +0000 UTC m=+0.029000855 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d81638df49d455b99611a8fcce6fdd1f8848e459de23f369309554e109d9f8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d81638df49d455b99611a8fcce6fdd1f8848e459de23f369309554e109d9f8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d81638df49d455b99611a8fcce6fdd1f8848e459de23f369309554e109d9f8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d81638df49d455b99611a8fcce6fdd1f8848e459de23f369309554e109d9f8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:36 compute-1 podman[78669]: 2025-11-29 06:18:36.449645041 +0000 UTC m=+0.239729130 container init 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef)
Nov 29 06:18:36 compute-1 podman[78669]: 2025-11-29 06:18:36.457360682 +0000 UTC m=+0.247444771 container start 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Nov 29 06:18:36 compute-1 podman[78669]: 2025-11-29 06:18:36.555555274 +0000 UTC m=+0.345639393 container attach 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Nov 29 06:18:37 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Nov 29 06:18:37 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Nov 29 06:18:37 compute-1 ecstatic_thompson[78718]: {
Nov 29 06:18:37 compute-1 ecstatic_thompson[78718]:     "f793b967-de22-4105-bb0d-c91464bf150f": {
Nov 29 06:18:37 compute-1 ecstatic_thompson[78718]:         "ceph_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 06:18:37 compute-1 ecstatic_thompson[78718]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Nov 29 06:18:37 compute-1 ecstatic_thompson[78718]:         "osd_id": 0,
Nov 29 06:18:37 compute-1 ecstatic_thompson[78718]:         "osd_uuid": "f793b967-de22-4105-bb0d-c91464bf150f",
Nov 29 06:18:37 compute-1 ecstatic_thompson[78718]:         "type": "bluestore"
Nov 29 06:18:37 compute-1 ecstatic_thompson[78718]:     }
Nov 29 06:18:37 compute-1 ecstatic_thompson[78718]: }
Nov 29 06:18:37 compute-1 systemd[1]: libpod-7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b.scope: Deactivated successfully.
Nov 29 06:18:37 compute-1 podman[78669]: 2025-11-29 06:18:37.357205132 +0000 UTC m=+1.147289251 container died 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:18:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-49d81638df49d455b99611a8fcce6fdd1f8848e459de23f369309554e109d9f8-merged.mount: Deactivated successfully.
Nov 29 06:18:37 compute-1 podman[78669]: 2025-11-29 06:18:37.41963113 +0000 UTC m=+1.209715209 container remove 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:18:37 compute-1 systemd[1]: libpod-conmon-7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b.scope: Deactivated successfully.
Nov 29 06:18:37 compute-1 sudo[78184]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:37 compute-1 ceph-osd[78089]: osd.0 0 done with init, starting boot process
Nov 29 06:18:37 compute-1 ceph-osd[78089]: osd.0 0 start_boot
Nov 29 06:18:37 compute-1 ceph-osd[78089]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Nov 29 06:18:37 compute-1 ceph-osd[78089]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Nov 29 06:18:37 compute-1 ceph-osd[78089]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Nov 29 06:18:37 compute-1 ceph-osd[78089]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Nov 29 06:18:37 compute-1 ceph-osd[78089]: osd.0 0  bench count 12288000 bsize 4 KiB
Nov 29 06:18:37 compute-1 sudo[78754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:37 compute-1 sudo[78754]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:37 compute-1 sudo[78754]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:37 compute-1 sudo[78780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:18:37 compute-1 sudo[78780]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:37 compute-1 sudo[78780]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-1 sudo[78805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:38 compute-1 sudo[78805]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-1 sudo[78805]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-1 sudo[78830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:38 compute-1 sudo[78830]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-1 sudo[78830]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-1 sudo[78855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:38 compute-1 sudo[78855]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-1 sudo[78855]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:38 compute-1 sudo[78880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:18:38 compute-1 sudo[78880]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:38 compute-1 podman[78973]: 2025-11-29 06:18:38.968398794 +0000 UTC m=+0.085767057 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Nov 29 06:18:39 compute-1 podman[78973]: 2025-11-29 06:18:39.157876506 +0000 UTC m=+0.275244749 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:18:39 compute-1 sudo[78880]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:39 compute-1 sudo[79019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:39 compute-1 sudo[79019]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:39 compute-1 sudo[79019]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:39 compute-1 sudo[79044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:39 compute-1 sudo[79044]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:39 compute-1 sudo[79044]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:39 compute-1 sudo[79069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:39 compute-1 sudo[79069]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:39 compute-1 sudo[79069]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:39 compute-1 sudo[79094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:18:39 compute-1 sudo[79094]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:40 compute-1 sudo[79094]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:40 compute-1 sudo[79151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:40 compute-1 sudo[79151]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:40 compute-1 sudo[79151]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:40 compute-1 sudo[79176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:18:40 compute-1 sudo[79176]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:40 compute-1 sudo[79176]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:40 compute-1 sudo[79201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:18:40 compute-1 sudo[79201]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:40 compute-1 sudo[79201]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:40 compute-1 sudo[79226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- inventory --format=json-pretty --filter-for-batch
Nov 29 06:18:40 compute-1 sudo[79226]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:18:41 compute-1 podman[79291]: 2025-11-29 06:18:41.162672336 +0000 UTC m=+0.098177123 container create 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:18:41 compute-1 podman[79291]: 2025-11-29 06:18:41.106746403 +0000 UTC m=+0.042251240 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:41 compute-1 systemd[1]: Started libpod-conmon-68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6.scope.
Nov 29 06:18:41 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:41 compute-1 podman[79291]: 2025-11-29 06:18:41.304857217 +0000 UTC m=+0.240361974 container init 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:18:41 compute-1 podman[79291]: 2025-11-29 06:18:41.317155589 +0000 UTC m=+0.252660376 container start 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Nov 29 06:18:41 compute-1 pedantic_golick[79307]: 167 167
Nov 29 06:18:41 compute-1 systemd[1]: libpod-68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6.scope: Deactivated successfully.
Nov 29 06:18:41 compute-1 podman[79291]: 2025-11-29 06:18:41.343673469 +0000 UTC m=+0.279178216 container attach 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:18:41 compute-1 podman[79291]: 2025-11-29 06:18:41.344245392 +0000 UTC m=+0.279750169 container died 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Nov 29 06:18:41 compute-1 systemd[1]: var-lib-containers-storage-overlay-63bbdbab1fec8dc963214bf1ba81c5684776540ca9f8caafa8f344b10310b0e3-merged.mount: Deactivated successfully.
Nov 29 06:18:41 compute-1 podman[79291]: 2025-11-29 06:18:41.512655655 +0000 UTC m=+0.448160442 container remove 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:18:41 compute-1 systemd[1]: libpod-conmon-68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6.scope: Deactivated successfully.
Nov 29 06:18:41 compute-1 podman[79331]: 2025-11-29 06:18:41.761456006 +0000 UTC m=+0.090668997 container create 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:18:41 compute-1 podman[79331]: 2025-11-29 06:18:41.709141812 +0000 UTC m=+0.038354873 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:18:41 compute-1 systemd[1]: Started libpod-conmon-4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf.scope.
Nov 29 06:18:41 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:18:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36ffeb5645fb884e536821a50008b21babfed81117b73b78108d744dd5709a36/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36ffeb5645fb884e536821a50008b21babfed81117b73b78108d744dd5709a36/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36ffeb5645fb884e536821a50008b21babfed81117b73b78108d744dd5709a36/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36ffeb5645fb884e536821a50008b21babfed81117b73b78108d744dd5709a36/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:18:41 compute-1 podman[79331]: 2025-11-29 06:18:41.913687558 +0000 UTC m=+0.242900639 container init 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Nov 29 06:18:41 compute-1 podman[79331]: 2025-11-29 06:18:41.922530835 +0000 UTC m=+0.251743826 container start 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 06:18:41 compute-1 podman[79331]: 2025-11-29 06:18:41.947725165 +0000 UTC m=+0.276938166 container attach 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:18:43 compute-1 admiring_wing[79347]: [
Nov 29 06:18:43 compute-1 admiring_wing[79347]:     {
Nov 29 06:18:43 compute-1 admiring_wing[79347]:         "available": false,
Nov 29 06:18:43 compute-1 admiring_wing[79347]:         "ceph_device": false,
Nov 29 06:18:43 compute-1 admiring_wing[79347]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:         "lsm_data": {},
Nov 29 06:18:43 compute-1 admiring_wing[79347]:         "lvs": [],
Nov 29 06:18:43 compute-1 admiring_wing[79347]:         "path": "/dev/sr0",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:         "rejected_reasons": [
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "Has a FileSystem",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "Insufficient space (<5GB)"
Nov 29 06:18:43 compute-1 admiring_wing[79347]:         ],
Nov 29 06:18:43 compute-1 admiring_wing[79347]:         "sys_api": {
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "actuators": null,
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "device_nodes": "sr0",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "devname": "sr0",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "human_readable_size": "482.00 KB",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "id_bus": "ata",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "model": "QEMU DVD-ROM",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "nr_requests": "2",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "parent": "/dev/sr0",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "partitions": {},
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "path": "/dev/sr0",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "removable": "1",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "rev": "2.5+",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "ro": "0",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "rotational": "1",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "sas_address": "",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "sas_device_handle": "",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "scheduler_mode": "mq-deadline",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "sectors": 0,
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "sectorsize": "2048",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "size": 493568.0,
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "support_discard": "2048",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "type": "disk",
Nov 29 06:18:43 compute-1 admiring_wing[79347]:             "vendor": "QEMU"
Nov 29 06:18:43 compute-1 admiring_wing[79347]:         }
Nov 29 06:18:43 compute-1 admiring_wing[79347]:     }
Nov 29 06:18:43 compute-1 admiring_wing[79347]: ]
Nov 29 06:18:43 compute-1 systemd[1]: libpod-4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf.scope: Deactivated successfully.
Nov 29 06:18:43 compute-1 podman[79331]: 2025-11-29 06:18:43.271902737 +0000 UTC m=+1.601115718 container died 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Nov 29 06:18:43 compute-1 systemd[1]: libpod-4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf.scope: Consumed 1.353s CPU time.
Nov 29 06:18:43 compute-1 systemd[1]: var-lib-containers-storage-overlay-36ffeb5645fb884e536821a50008b21babfed81117b73b78108d744dd5709a36-merged.mount: Deactivated successfully.
Nov 29 06:18:43 compute-1 podman[79331]: 2025-11-29 06:18:43.476395812 +0000 UTC m=+1.805608793 container remove 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Nov 29 06:18:43 compute-1 systemd[1]: libpod-conmon-4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf.scope: Deactivated successfully.
Nov 29 06:18:43 compute-1 ceph-osd[78089]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 11.852 iops: 3033.996 elapsed_sec: 0.989
Nov 29 06:18:43 compute-1 ceph-osd[78089]: log_channel(cluster) log [WRN] : OSD bench result of 3033.995593 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 29 06:18:43 compute-1 ceph-osd[78089]: osd.0 0 waiting for initial osdmap
Nov 29 06:18:43 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0[78085]: 2025-11-29T06:18:43.486+0000 7ff68c91e640 -1 osd.0 0 waiting for initial osdmap
Nov 29 06:18:43 compute-1 sudo[79226]: pam_unix(sudo:session): session closed for user root
Nov 29 06:18:43 compute-1 ceph-osd[78089]: osd.0 7 crush map has features 288514050185494528, adjusting msgr requires for clients
Nov 29 06:18:43 compute-1 ceph-osd[78089]: osd.0 7 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Nov 29 06:18:43 compute-1 ceph-osd[78089]: osd.0 7 crush map has features 3314932999778484224, adjusting msgr requires for osds
Nov 29 06:18:43 compute-1 ceph-osd[78089]: osd.0 7 check_osdmap_features require_osd_release unknown -> reef
Nov 29 06:18:43 compute-1 ceph-osd[78089]: osd.0 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 29 06:18:43 compute-1 ceph-osd[78089]: osd.0 7 set_numa_affinity not setting numa affinity
Nov 29 06:18:43 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0[78085]: 2025-11-29T06:18:43.659+0000 7ff687f46640 -1 osd.0 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 29 06:18:43 compute-1 ceph-osd[78089]: osd.0 7 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Nov 29 06:18:43 compute-1 sshd-session[77893]: error: kex_exchange_identification: read: Connection timed out
Nov 29 06:18:43 compute-1 sshd-session[77893]: banner exchange: Connection from 112.4.79.138 port 54394: Connection timed out
Nov 29 06:18:44 compute-1 ceph-osd[78089]: osd.0 7 tick checking mon for new map
Nov 29 06:18:46 compute-1 ceph-osd[78089]: osd.0 8 state: booting -> active
Nov 29 06:18:51 compute-1 ceph-osd[78089]: osd.0 11 crush map has features 288514051259236352, adjusting msgr requires for clients
Nov 29 06:18:51 compute-1 ceph-osd[78089]: osd.0 11 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Nov 29 06:18:51 compute-1 ceph-osd[78089]: osd.0 11 crush map has features 3314933000852226048, adjusting msgr requires for osds
Nov 29 06:18:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 10 pg[1.0( empty local-lis/les=0/0 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=10) [0] r=0 lpr=10 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 12 pg[2.0( empty local-lis/les=0/0 n=0 ec=12/12 lis/c=0/0 les/c/f=0/0/0 sis=12) [0] r=0 lpr=12 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 12 pg[1.0( empty local-lis/les=10/12 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=10) [0] r=0 lpr=10 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:18:53 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 13 pg[2.0( empty local-lis/les=12/13 n=0 ec=12/12 lis/c=0/0 les/c/f=0/0/0 sis=12) [0] r=0 lpr=12 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 17 pg[2.0( empty local-lis/les=12/13 n=0 ec=12/12 lis/c=12/12 les/c/f=13/13/0 sis=17 pruub=10.059963226s) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active pruub 33.613990784s@ mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.0( empty local-lis/les=12/13 n=0 ec=12/12 lis/c=12/12 les/c/f=13/13/0 sis=17 pruub=10.059963226s) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown pruub 33.613990784s@ mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.5( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.4( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.2( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.8( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.3( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.9( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.6( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.7( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.b( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.a( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.c( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.d( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.e( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.f( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.10( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.11( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.12( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1a( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.13( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.14( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1b( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.16( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.15( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.17( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1e( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1f( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1c( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1d( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.18( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:18:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.19( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.a( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.9( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.7( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.8( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.5( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.6( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.4( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.0( empty local-lis/les=17/19 n=0 ec=12/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.3( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.2( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.10( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.13( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.11( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.15( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.14( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.16( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.12( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.17( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.18( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1a( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.19( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:00 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:01 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1 deep-scrub starts
Nov 29 06:19:01 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1 deep-scrub ok
Nov 29 06:19:02 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Nov 29 06:19:02 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Nov 29 06:19:04 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Nov 29 06:19:04 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Nov 29 06:19:04 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 21 pg[7.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:19:05 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Nov 29 06:19:05 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Nov 29 06:19:05 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 22 pg[7.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:19:06 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Nov 29 06:19:06 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Nov 29 06:19:07 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.6 deep-scrub starts
Nov 29 06:19:07 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.6 deep-scrub ok
Nov 29 06:19:08 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Nov 29 06:19:08 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.235153198s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.618747711s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238571167s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622200012s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.235013008s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.618747711s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.a( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238540649s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622329712s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238471031s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622200012s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.a( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238486290s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622329712s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.6( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238062859s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622528076s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.6( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238033295s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622528076s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.4( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237872124s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622528076s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.4( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237847328s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622528076s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237704277s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622562408s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237682343s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622562408s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237563133s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622646332s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237541199s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622646332s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237453461s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622661591s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237430573s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622661591s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237401009s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622741699s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.9( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237115860s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622425079s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237380028s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622741699s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.10( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237191200s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622676849s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.13( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237181664s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622734070s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.10( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237103462s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622676849s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.15( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237176895s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622810364s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.13( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237123489s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622734070s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.15( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237146378s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622810364s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.9( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.236943245s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622425079s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.19( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.236249924s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622917175s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.236279488s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622924805s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.19( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.236173630s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622917175s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:08 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.236118317s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622924805s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:11 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Nov 29 06:19:11 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Nov 29 06:19:12 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.b scrub starts
Nov 29 06:19:12 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.b scrub ok
Nov 29 06:19:18 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.f deep-scrub starts
Nov 29 06:19:18 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.f deep-scrub ok
Nov 29 06:19:18 compute-1 sudo[80381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:18 compute-1 sudo[80381]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:18 compute-1 sudo[80381]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:18 compute-1 sudo[80406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:18 compute-1 sudo[80406]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:18 compute-1 sudo[80406]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:18 compute-1 sudo[80431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:18 compute-1 sudo[80431]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:18 compute-1 sudo[80431]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:18 compute-1 sudo[80456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:18 compute-1 sudo[80456]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:19 compute-1 podman[80521]: 2025-11-29 06:19:19.359686699 +0000 UTC m=+0.067614163 container create ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:19:19 compute-1 podman[80521]: 2025-11-29 06:19:19.317525072 +0000 UTC m=+0.025452516 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:19 compute-1 systemd[1]: Started libpod-conmon-ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4.scope.
Nov 29 06:19:19 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:19:19 compute-1 podman[80521]: 2025-11-29 06:19:19.499093687 +0000 UTC m=+0.207021211 container init ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Nov 29 06:19:19 compute-1 podman[80521]: 2025-11-29 06:19:19.511517674 +0000 UTC m=+0.219445138 container start ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:19:19 compute-1 flamboyant_bardeen[80538]: 167 167
Nov 29 06:19:19 compute-1 systemd[1]: libpod-ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4.scope: Deactivated successfully.
Nov 29 06:19:19 compute-1 podman[80521]: 2025-11-29 06:19:19.533347909 +0000 UTC m=+0.241275333 container attach ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:19 compute-1 podman[80521]: 2025-11-29 06:19:19.534371272 +0000 UTC m=+0.242298696 container died ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-f8d6376ba1ffb61aae378cdd04af6587cedb52dad915bac4e0d41fffce32b962-merged.mount: Deactivated successfully.
Nov 29 06:19:19 compute-1 podman[80521]: 2025-11-29 06:19:19.837244144 +0000 UTC m=+0.545171608 container remove ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:19:19 compute-1 systemd[1]: libpod-conmon-ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4.scope: Deactivated successfully.
Nov 29 06:19:20 compute-1 podman[80558]: 2025-11-29 06:19:20.006343193 +0000 UTC m=+0.123754573 container create 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:20 compute-1 podman[80558]: 2025-11-29 06:19:19.924315089 +0000 UTC m=+0.041726489 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:20 compute-1 systemd[1]: Started libpod-conmon-318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819.scope.
Nov 29 06:19:20 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:19:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cc792b2a68caa0e6d619839f761005b4a61dd5abfcc89b0469664e74e7b305d/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cc792b2a68caa0e6d619839f761005b4a61dd5abfcc89b0469664e74e7b305d/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cc792b2a68caa0e6d619839f761005b4a61dd5abfcc89b0469664e74e7b305d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cc792b2a68caa0e6d619839f761005b4a61dd5abfcc89b0469664e74e7b305d/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:20 compute-1 podman[80558]: 2025-11-29 06:19:20.079411677 +0000 UTC m=+0.196823077 container init 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:20 compute-1 podman[80558]: 2025-11-29 06:19:20.089365368 +0000 UTC m=+0.206776748 container start 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 06:19:20 compute-1 podman[80558]: 2025-11-29 06:19:20.09623029 +0000 UTC m=+0.213641690 container attach 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Nov 29 06:19:20 compute-1 systemd[1]: libpod-318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819.scope: Deactivated successfully.
Nov 29 06:19:20 compute-1 podman[80558]: 2025-11-29 06:19:20.468713879 +0000 UTC m=+0.586125299 container died 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Nov 29 06:19:20 compute-1 systemd[1]: var-lib-containers-storage-overlay-9cc792b2a68caa0e6d619839f761005b4a61dd5abfcc89b0469664e74e7b305d-merged.mount: Deactivated successfully.
Nov 29 06:19:20 compute-1 podman[80558]: 2025-11-29 06:19:20.579477821 +0000 UTC m=+0.696889211 container remove 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:20 compute-1 systemd[1]: libpod-conmon-318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819.scope: Deactivated successfully.
Nov 29 06:19:20 compute-1 systemd[1]: Reloading.
Nov 29 06:19:20 compute-1 systemd-rc-local-generator[80640]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:20 compute-1 systemd-sysv-generator[80643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:21 compute-1 systemd[1]: Reloading.
Nov 29 06:19:21 compute-1 systemd-sysv-generator[80683]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:21 compute-1 systemd-rc-local-generator[80679]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:21 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Nov 29 06:19:21 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Nov 29 06:19:21 compute-1 systemd[1]: Starting Ceph mon.compute-1 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:19:21 compute-1 podman[80734]: 2025-11-29 06:19:21.646084309 +0000 UTC m=+0.074928517 container create 6c6562254e3e4a8763ba2492de731371242690cebdb554011375bcced4c68e5c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:21 compute-1 podman[80734]: 2025-11-29 06:19:21.600353442 +0000 UTC m=+0.029197640 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c88a17628323222b9dfc9d24fc0d50145acb7a393baa442577f5e0ba9ac7f73/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c88a17628323222b9dfc9d24fc0d50145acb7a393baa442577f5e0ba9ac7f73/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c88a17628323222b9dfc9d24fc0d50145acb7a393baa442577f5e0ba9ac7f73/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c88a17628323222b9dfc9d24fc0d50145acb7a393baa442577f5e0ba9ac7f73/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:21 compute-1 podman[80734]: 2025-11-29 06:19:21.724093913 +0000 UTC m=+0.152938071 container init 6c6562254e3e4a8763ba2492de731371242690cebdb554011375bcced4c68e5c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-1, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Nov 29 06:19:21 compute-1 podman[80734]: 2025-11-29 06:19:21.728779357 +0000 UTC m=+0.157623515 container start 6c6562254e3e4a8763ba2492de731371242690cebdb554011375bcced4c68e5c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-1, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:19:21 compute-1 bash[80734]: 6c6562254e3e4a8763ba2492de731371242690cebdb554011375bcced4c68e5c
Nov 29 06:19:21 compute-1 systemd[1]: Started Ceph mon.compute-1 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:19:21 compute-1 ceph-mon[80754]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:19:21 compute-1 ceph-mon[80754]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Nov 29 06:19:21 compute-1 ceph-mon[80754]: pidfile_write: ignore empty --pid-file
Nov 29 06:19:21 compute-1 ceph-mon[80754]: load: jerasure load: lrc 
Nov 29 06:19:21 compute-1 sudo[80456]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: RocksDB version: 7.9.2
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Git sha 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: DB SUMMARY
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: DB Session ID:  5Q1WIIQG9BN5XI35108Y
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: CURRENT file:  CURRENT
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 511 ; 
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                         Options.error_if_exists: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                       Options.create_if_missing: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                                     Options.env: 0x562154236c40
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                                      Options.fs: PosixFileSystem
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                                Options.info_log: 0x562155f78fc0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                              Options.statistics: (nil)
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                               Options.use_fsync: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                              Options.db_log_dir: 
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                                 Options.wal_dir: 
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                    Options.write_buffer_manager: 0x562155f88b40
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                  Options.unordered_write: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                               Options.row_cache: None
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                              Options.wal_filter: None
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.two_write_queues: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.wal_compression: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.atomic_flush: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.max_background_jobs: 2
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.max_background_compactions: -1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.max_subcompactions: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.max_total_wal_size: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                          Options.max_open_files: -1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:       Options.compaction_readahead_size: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Compression algorithms supported:
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         kZSTD supported: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         kXpressCompression supported: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         kBZip2Compression supported: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         kLZ4Compression supported: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         kZlibCompression supported: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         kLZ4HCCompression supported: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         kSnappyCompression supported: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:           Options.merge_operator: 
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:        Options.compaction_filter: None
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562155f78c00)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x562155f711f0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:        Options.write_buffer_size: 33554432
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:  Options.max_write_buffer_number: 2
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:          Options.compression: NoCompression
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.num_levels: 7
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                           Options.bloom_locality: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                               Options.ttl: 2592000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                       Options.enable_blob_files: false
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                           Options.min_blob_size: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397161777911, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397161885594, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397161885807, "job": 1, "event": "recovery_finished"}
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x562155f9ae00
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: DB pointer 0x5621560a2000
Nov 29 06:19:21 compute-1 ceph-mon[80754]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:19:21 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0
                                            Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x562155f711f0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Nov 29 06:19:21 compute-1 ceph-mon[80754]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Nov 29 06:19:21 compute-1 ceph-mon[80754]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:21 compute-1 ceph-mon[80754]: mon.compute-1@-1(???) e0 preinit fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).mds e1 new map
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).mds e1 print_map
                                           e1
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 1 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 1 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e28 e28: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e29 e29: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e29 crush map has features 3314933000852226048, adjusting msgr requires
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e13: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v66: 2 pgs: 1 unknown, 1 creating+peering; 0 B data, 853 MiB used, 13 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/577122409' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/577122409' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mgrmap e8: compute-0.vxabpq(active, since 2m)
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e14: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v68: 3 pgs: 2 unknown, 1 creating+peering; 0 B data, 453 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e15: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1457732535' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1457732535' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e16: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v71: 4 pgs: 2 active+clean, 2 unknown; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e17: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2491487437' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2491487437' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e18: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v74: 36 pgs: 2 active+clean, 34 unknown; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e19: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.1 deep-scrub starts
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.1 deep-scrub ok
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2900095816' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2900095816' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e20: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v77: 37 pgs: 1 unknown, 1 creating+peering, 35 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.2 scrub starts
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.2 scrub ok
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/956031255' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v78: 37 pgs: 1 unknown, 1 creating+peering, 35 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.3 scrub starts
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.3 scrub ok
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/956031255' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e21: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.4 scrub starts
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.4 scrub ok
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e22: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2774593808' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v81: 38 pgs: 1 unknown, 37 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.5 scrub starts
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.5 scrub ok
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Health check update: 6 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2774593808' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e23: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.6 deep-scrub starts
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.6 deep-scrub ok
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3785446785' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v83: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.7 scrub starts
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.7 scrub ok
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3785446785' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e24: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e25: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v86: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3924631149' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3924631149' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e26: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.8 scrub starts
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.8 scrub ok
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.e scrub starts
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.e scrub ok
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Updating compute-2:/etc/ceph/ceph.conf
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v88: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.b scrub starts
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.b scrub ok
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/935132046' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/935132046' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e27: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v90: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1714792720' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1714792720' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e28: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v92: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v93: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: pgmap v94: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Deploying daemon mon.compute-2 on compute-2
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Health check cleared: CEPHADM_REFRESH_FAILED (was: failed to probe daemons or devices)
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2338482810' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Nov 29 06:19:22 compute-1 ceph-mon[80754]: Health check update: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:19:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2338482810' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Nov 29 06:19:22 compute-1 ceph-mon[80754]: osdmap e29: 2 total, 2 up, 2 in
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.a scrub starts
Nov 29 06:19:22 compute-1 ceph-mon[80754]: 2.a scrub ok
Nov 29 06:19:22 compute-1 ceph-mon[80754]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Nov 29 06:19:22 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Nov 29 06:19:22 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Nov 29 06:19:25 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Nov 29 06:19:25 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Nov 29 06:19:26 compute-1 ceph-mon[80754]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Nov 29 06:19:26 compute-1 ceph-mon[80754]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Nov 29 06:19:26 compute-1 ceph-mon[80754]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 29 06:19:26 compute-1 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:19:27 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Nov 29 06:19:27 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Nov 29 06:19:28 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Nov 29 06:19:28 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.f deep-scrub starts
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.f deep-scrub ok
Nov 29 06:19:29 compute-1 ceph-mon[80754]: Deploying daemon mon.compute-1 on compute-1
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: pgmap v97: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.11 scrub starts
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.11 scrub ok
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: pgmap v98: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.12 scrub starts
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.12 scrub ok
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.c scrub starts
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.c scrub ok
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Nov 29 06:19:29 compute-1 ceph-mon[80754]: monmap e2: 2 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],compute-2=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]} removed_ranks: {} disallowed_leaders: {}
Nov 29 06:19:29 compute-1 ceph-mon[80754]: fsmap 
Nov 29 06:19:29 compute-1 ceph-mon[80754]: osdmap e29: 2 total, 2 up, 2 in
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mgrmap e8: compute-0.vxabpq(active, since 2m)
Nov 29 06:19:29 compute-1 ceph-mon[80754]: overall HEALTH_OK
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2025-11-29T06:19:20.235344Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,os=Linux}
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:29 compute-1 ceph-mon[80754]: pgmap v99: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:29 compute-1 ceph-mon[80754]: Deploying daemon mgr.compute-2.ngsyhe on compute-2
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/501439537' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.14 scrub starts
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.14 scrub ok
Nov 29 06:19:29 compute-1 ceph-mon[80754]: pgmap v100: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-1 calling monitor election
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.d scrub starts
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.d scrub ok
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.16 scrub starts
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.16 scrub ok
Nov 29 06:19:29 compute-1 ceph-mon[80754]: pgmap v101: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.17 scrub starts
Nov 29 06:19:29 compute-1 ceph-mon[80754]: 2.17 scrub ok
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 06:19:29 compute-1 ceph-mon[80754]: monmap e3: 3 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],compute-1=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],compute-2=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]} removed_ranks: {} disallowed_leaders: {}
Nov 29 06:19:29 compute-1 ceph-mon[80754]: fsmap 
Nov 29 06:19:29 compute-1 ceph-mon[80754]: osdmap e29: 2 total, 2 up, 2 in
Nov 29 06:19:29 compute-1 ceph-mon[80754]: mgrmap e8: compute-0.vxabpq(active, since 2m)
Nov 29 06:19:29 compute-1 ceph-mon[80754]: overall HEALTH_OK
Nov 29 06:19:29 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/501439537' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Nov 29 06:19:29 compute-1 sudo[80793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:29 compute-1 sudo[80793]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:29 compute-1 sudo[80793]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:29 compute-1 sudo[80818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:29 compute-1 sudo[80818]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:29 compute-1 sudo[80818]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:29 compute-1 sudo[80843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:29 compute-1 sudo[80843]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:29 compute-1 sudo[80843]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:29 compute-1 sudo[80868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:19:29 compute-1 sudo[80868]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:30 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Nov 29 06:19:30 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Nov 29 06:19:30 compute-1 podman[80934]: 2025-11-29 06:19:30.367056762 +0000 UTC m=+0.054811536 container create ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Nov 29 06:19:30 compute-1 systemd[72642]: Starting Mark boot as successful...
Nov 29 06:19:30 compute-1 systemd[1]: Started libpod-conmon-ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17.scope.
Nov 29 06:19:30 compute-1 systemd[72642]: Finished Mark boot as successful.
Nov 29 06:19:30 compute-1 podman[80934]: 2025-11-29 06:19:30.342121769 +0000 UTC m=+0.029876553 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:30 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:19:30 compute-1 podman[80934]: 2025-11-29 06:19:30.460239555 +0000 UTC m=+0.147994329 container init ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:30 compute-1 podman[80934]: 2025-11-29 06:19:30.473560336 +0000 UTC m=+0.161315110 container start ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507)
Nov 29 06:19:30 compute-1 podman[80934]: 2025-11-29 06:19:30.478437351 +0000 UTC m=+0.166192115 container attach ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:30 compute-1 goofy_mcclintock[80951]: 167 167
Nov 29 06:19:30 compute-1 systemd[1]: libpod-ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17.scope: Deactivated successfully.
Nov 29 06:19:30 compute-1 podman[80934]: 2025-11-29 06:19:30.483724869 +0000 UTC m=+0.171479643 container died ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:19:30 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:30 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:30 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:30 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:30 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gaxpay", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 29 06:19:30 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gaxpay", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 29 06:19:30 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:19:30 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:30 compute-1 ceph-mon[80754]: Deploying daemon mgr.compute-1.gaxpay on compute-1
Nov 29 06:19:30 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Nov 29 06:19:30 compute-1 ceph-mon[80754]: 2.18 scrub starts
Nov 29 06:19:30 compute-1 ceph-mon[80754]: 2.18 scrub ok
Nov 29 06:19:30 compute-1 systemd[1]: var-lib-containers-storage-overlay-bc21a13aab8904e76099b5019bb20a83d33371e7a6924527526aec83b2b0cadb-merged.mount: Deactivated successfully.
Nov 29 06:19:30 compute-1 podman[80934]: 2025-11-29 06:19:30.539262213 +0000 UTC m=+0.227016987 container remove ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Nov 29 06:19:30 compute-1 systemd[1]: libpod-conmon-ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17.scope: Deactivated successfully.
Nov 29 06:19:30 compute-1 systemd[1]: Reloading.
Nov 29 06:19:30 compute-1 systemd-rc-local-generator[80995]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:30 compute-1 systemd-sysv-generator[80999]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:30 compute-1 systemd[1]: Reloading.
Nov 29 06:19:31 compute-1 systemd-rc-local-generator[81038]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:31 compute-1 systemd-sysv-generator[81042]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:19:31 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1a deep-scrub starts
Nov 29 06:19:31 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1a deep-scrub ok
Nov 29 06:19:31 compute-1 systemd[1]: Starting Ceph mgr.compute-1.gaxpay for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:19:31 compute-1 podman[81097]: 2025-11-29 06:19:31.522695665 +0000 UTC m=+0.058182730 container create a8b9f68ee8f2adfa4e2e87b3aa53b3fca9f9c47317ab426c07bfc0ef9f58e64c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Nov 29 06:19:31 compute-1 podman[81097]: 2025-11-29 06:19:31.491901288 +0000 UTC m=+0.027388403 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a5a650bd8be7eb3991a168818abad049a3e29f9b126c9e3b83ec22bdd71fbe1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a5a650bd8be7eb3991a168818abad049a3e29f9b126c9e3b83ec22bdd71fbe1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a5a650bd8be7eb3991a168818abad049a3e29f9b126c9e3b83ec22bdd71fbe1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:31 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a5a650bd8be7eb3991a168818abad049a3e29f9b126c9e3b83ec22bdd71fbe1/merged/var/lib/ceph/mgr/ceph-compute-1.gaxpay supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:31 compute-1 ceph-mon[80754]: pgmap v102: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:31 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2714267067' entity='client.admin' 
Nov 29 06:19:31 compute-1 ceph-mon[80754]: 2.1a deep-scrub starts
Nov 29 06:19:31 compute-1 ceph-mon[80754]: 2.1a deep-scrub ok
Nov 29 06:19:31 compute-1 podman[81097]: 2025-11-29 06:19:31.611064573 +0000 UTC m=+0.146551688 container init a8b9f68ee8f2adfa4e2e87b3aa53b3fca9f9c47317ab426c07bfc0ef9f58e64c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:19:31 compute-1 podman[81097]: 2025-11-29 06:19:31.619481828 +0000 UTC m=+0.154968873 container start a8b9f68ee8f2adfa4e2e87b3aa53b3fca9f9c47317ab426c07bfc0ef9f58e64c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:31 compute-1 bash[81097]: a8b9f68ee8f2adfa4e2e87b3aa53b3fca9f9c47317ab426c07bfc0ef9f58e64c
Nov 29 06:19:31 compute-1 systemd[1]: Started Ceph mgr.compute-1.gaxpay for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:19:31 compute-1 sudo[80868]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:31 compute-1 ceph-mgr[81116]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:19:31 compute-1 ceph-mgr[81116]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Nov 29 06:19:31 compute-1 ceph-mgr[81116]: pidfile_write: ignore empty --pid-file
Nov 29 06:19:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Nov 29 06:19:31 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'alerts'
Nov 29 06:19:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e29 _set_new_cache_sizes cache_size:1019927404 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:19:32 compute-1 ceph-mgr[81116]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 29 06:19:32 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'balancer'
Nov 29 06:19:32 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:32.259+0000 7f8d11e36140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 29 06:19:32 compute-1 ceph-mgr[81116]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 29 06:19:32 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'cephadm'
Nov 29 06:19:32 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:32.603+0000 7f8d11e36140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 29 06:19:32 compute-1 ceph-mon[80754]: 2.1e scrub starts
Nov 29 06:19:32 compute-1 ceph-mon[80754]: 2.1e scrub ok
Nov 29 06:19:32 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-1 ceph-mon[80754]: pgmap v103: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:32 compute-1 ceph-mon[80754]: from='client.14256 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:19:32 compute-1 ceph-mon[80754]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Nov 29 06:19:32 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-1 ceph-mon[80754]: Saving service ingress.rgw.default spec with placement count:2
Nov 29 06:19:32 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 06:19:32 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:32 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Nov 29 06:19:32 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:32 compute-1 ceph-mon[80754]: Deploying daemon crash.compute-2 on compute-2
Nov 29 06:19:33 compute-1 ceph-mon[80754]: 2.1f scrub starts
Nov 29 06:19:33 compute-1 ceph-mon[80754]: 2.1f scrub ok
Nov 29 06:19:34 compute-1 sudo[81152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:34 compute-1 sudo[81152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:34 compute-1 sudo[81152]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:34 compute-1 sudo[81177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:34 compute-1 sudo[81177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:34 compute-1 sudo[81177]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:34 compute-1 sudo[81202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:34 compute-1 sudo[81202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:34 compute-1 sudo[81202]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:34 compute-1 sudo[81227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Nov 29 06:19:34 compute-1 sudo[81227]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:34 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'crash'
Nov 29 06:19:35 compute-1 podman[81290]: 2025-11-29 06:19:35.000949576 +0000 UTC m=+0.045536127 container create 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True)
Nov 29 06:19:35 compute-1 ceph-mgr[81116]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 29 06:19:35 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'dashboard'
Nov 29 06:19:35 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:35.013+0000 7f8d11e36140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 29 06:19:35 compute-1 systemd[1]: Started libpod-conmon-675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3.scope.
Nov 29 06:19:35 compute-1 podman[81290]: 2025-11-29 06:19:34.981242158 +0000 UTC m=+0.025828779 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:35 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:19:35 compute-1 podman[81290]: 2025-11-29 06:19:35.110543945 +0000 UTC m=+0.155130496 container init 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2)
Nov 29 06:19:35 compute-1 podman[81290]: 2025-11-29 06:19:35.121650754 +0000 UTC m=+0.166237315 container start 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Nov 29 06:19:35 compute-1 podman[81290]: 2025-11-29 06:19:35.128402722 +0000 UTC m=+0.172989303 container attach 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:35 compute-1 stoic_volhard[81306]: 167 167
Nov 29 06:19:35 compute-1 systemd[1]: libpod-675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3.scope: Deactivated successfully.
Nov 29 06:19:35 compute-1 podman[81290]: 2025-11-29 06:19:35.135025827 +0000 UTC m=+0.179612368 container died 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 06:19:35 compute-1 systemd[1]: var-lib-containers-storage-overlay-7c965bbd6278b0896e99215cf57fa69530bd7c85476f7d3218ed7e7c40963309-merged.mount: Deactivated successfully.
Nov 29 06:19:35 compute-1 podman[81290]: 2025-11-29 06:19:35.183177636 +0000 UTC m=+0.227764157 container remove 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Nov 29 06:19:35 compute-1 systemd[1]: libpod-conmon-675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3.scope: Deactivated successfully.
Nov 29 06:19:35 compute-1 ceph-mon[80754]: pgmap v104: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:35 compute-1 ceph-mon[80754]: 2.10 scrub starts
Nov 29 06:19:35 compute-1 ceph-mon[80754]: 2.10 scrub ok
Nov 29 06:19:35 compute-1 podman[81331]: 2025-11-29 06:19:35.388372905 +0000 UTC m=+0.078094603 container create ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Nov 29 06:19:35 compute-1 systemd[1]: Started libpod-conmon-ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0.scope.
Nov 29 06:19:35 compute-1 podman[81331]: 2025-11-29 06:19:35.356333144 +0000 UTC m=+0.046054892 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:35 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:19:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:35 compute-1 podman[81331]: 2025-11-29 06:19:35.519189825 +0000 UTC m=+0.208911563 container init ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Nov 29 06:19:35 compute-1 podman[81331]: 2025-11-29 06:19:35.535406146 +0000 UTC m=+0.225127824 container start ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:35 compute-1 podman[81331]: 2025-11-29 06:19:35.539549912 +0000 UTC m=+0.229271580 container attach ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:19:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e2 new map
Nov 29 06:19:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e2 print_map
                                           e2
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:19:35.589013+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                            
                                            
Nov 29 06:19:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e30 e30: 2 total, 2 up, 2 in
Nov 29 06:19:36 compute-1 ceph-mon[80754]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "compute-0 compute-1 compute-2 ", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:19:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Nov 29 06:19:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Nov 29 06:19:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Nov 29 06:19:36 compute-1 ceph-mon[80754]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Nov 29 06:19:36 compute-1 ceph-mon[80754]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Nov 29 06:19:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Nov 29 06:19:36 compute-1 ceph-mon[80754]: osdmap e30: 2 total, 2 up, 2 in
Nov 29 06:19:36 compute-1 ceph-mon[80754]: fsmap cephfs:0
Nov 29 06:19:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:36 compute-1 upbeat_ellis[81347]: --> passed data devices: 0 physical, 1 LVM
Nov 29 06:19:36 compute-1 upbeat_ellis[81347]: --> relative data size: 1.0
Nov 29 06:19:36 compute-1 upbeat_ellis[81347]: --> All data devices are unavailable
Nov 29 06:19:36 compute-1 systemd[1]: libpod-ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0.scope: Deactivated successfully.
Nov 29 06:19:36 compute-1 podman[81362]: 2025-11-29 06:19:36.417189189 +0000 UTC m=+0.026466027 container died ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Nov 29 06:19:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2-merged.mount: Deactivated successfully.
Nov 29 06:19:36 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'devicehealth'
Nov 29 06:19:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e31 e31: 3 total, 2 up, 3 in
Nov 29 06:19:36 compute-1 podman[81362]: 2025-11-29 06:19:36.568600202 +0000 UTC m=+0.177877020 container remove ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:19:36 compute-1 systemd[1]: libpod-conmon-ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0.scope: Deactivated successfully.
Nov 29 06:19:36 compute-1 sudo[81227]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:36 compute-1 sudo[81377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:36 compute-1 sudo[81377]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:36 compute-1 sudo[81377]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:36 compute-1 sudo[81402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:36 compute-1 sudo[81402]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:36 compute-1 sudo[81402]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:36 compute-1 ceph-mgr[81116]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 29 06:19:36 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'diskprediction_local'
Nov 29 06:19:36 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:36.819+0000 7f8d11e36140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 29 06:19:36 compute-1 sudo[81427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:36 compute-1 sudo[81427]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:36 compute-1 sudo[81427]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020053102 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:19:36 compute-1 sudo[81452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- lvm list --format json
Nov 29 06:19:36 compute-1 sudo[81452]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:37 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Nov 29 06:19:37 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Nov 29 06:19:37 compute-1 sshd-session[81498]: error: kex_exchange_identification: read: Connection reset by peer
Nov 29 06:19:37 compute-1 sshd-session[81498]: Connection reset by 45.140.17.97 port 32538
Nov 29 06:19:37 compute-1 podman[81519]: 2025-11-29 06:19:37.273435461 +0000 UTC m=+0.040187799 container create e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:19:37 compute-1 systemd[1]: Started libpod-conmon-e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1.scope.
Nov 29 06:19:37 compute-1 ceph-mon[80754]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Nov 29 06:19:37 compute-1 ceph-mon[80754]: pgmap v106: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:37 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2624547066' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]: dispatch
Nov 29 06:19:37 compute-1 ceph-mon[80754]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]: dispatch
Nov 29 06:19:37 compute-1 ceph-mon[80754]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]': finished
Nov 29 06:19:37 compute-1 ceph-mon[80754]: osdmap e31: 3 total, 2 up, 3 in
Nov 29 06:19:37 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:37 compute-1 ceph-mon[80754]: 2.1c scrub starts
Nov 29 06:19:37 compute-1 ceph-mon[80754]: 2.1c scrub ok
Nov 29 06:19:37 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:37 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2894938433' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Nov 29 06:19:37 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:19:37 compute-1 podman[81519]: 2025-11-29 06:19:37.257268742 +0000 UTC m=+0.024021110 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:37 compute-1 podman[81519]: 2025-11-29 06:19:37.366271954 +0000 UTC m=+0.133024312 container init e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:37 compute-1 podman[81519]: 2025-11-29 06:19:37.37871557 +0000 UTC m=+0.145467918 container start e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:37 compute-1 podman[81519]: 2025-11-29 06:19:37.383071981 +0000 UTC m=+0.149824339 container attach e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2)
Nov 29 06:19:37 compute-1 optimistic_wing[81535]: 167 167
Nov 29 06:19:37 compute-1 systemd[1]: libpod-e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1.scope: Deactivated successfully.
Nov 29 06:19:37 compute-1 podman[81519]: 2025-11-29 06:19:37.385724505 +0000 UTC m=+0.152476863 container died e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Nov 29 06:19:37 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 29 06:19:37 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 29 06:19:37 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]:   from numpy import show_config as show_numpy_config
Nov 29 06:19:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-57188317ae84602b08fa483a67ef1e66b83325c8218fd4118670772c88cd54e8-merged.mount: Deactivated successfully.
Nov 29 06:19:37 compute-1 ceph-mgr[81116]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 29 06:19:37 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:37.418+0000 7f8d11e36140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 29 06:19:37 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'influx'
Nov 29 06:19:37 compute-1 podman[81519]: 2025-11-29 06:19:37.436402495 +0000 UTC m=+0.203154853 container remove e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:37 compute-1 systemd[1]: libpod-conmon-e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1.scope: Deactivated successfully.
Nov 29 06:19:37 compute-1 podman[81559]: 2025-11-29 06:19:37.61414869 +0000 UTC m=+0.049041945 container create 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:37 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:37.656+0000 7f8d11e36140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 29 06:19:37 compute-1 ceph-mgr[81116]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 29 06:19:37 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'insights'
Nov 29 06:19:37 compute-1 systemd[1]: Started libpod-conmon-8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5.scope.
Nov 29 06:19:37 compute-1 podman[81559]: 2025-11-29 06:19:37.59471116 +0000 UTC m=+0.029604445 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:37 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:19:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb588a27fcacf5a0d04b5f02fb51c7b0b9ba2f101c539581f9956da9fc72d395/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb588a27fcacf5a0d04b5f02fb51c7b0b9ba2f101c539581f9956da9fc72d395/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb588a27fcacf5a0d04b5f02fb51c7b0b9ba2f101c539581f9956da9fc72d395/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb588a27fcacf5a0d04b5f02fb51c7b0b9ba2f101c539581f9956da9fc72d395/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:37 compute-1 podman[81559]: 2025-11-29 06:19:37.713074973 +0000 UTC m=+0.147968258 container init 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Nov 29 06:19:37 compute-1 podman[81559]: 2025-11-29 06:19:37.721385044 +0000 UTC m=+0.156278339 container start 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:19:37 compute-1 podman[81559]: 2025-11-29 06:19:37.726343333 +0000 UTC m=+0.161236608 container attach 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Nov 29 06:19:37 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'iostat'
Nov 29 06:19:38 compute-1 ceph-mgr[81116]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 29 06:19:38 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'k8sevents'
Nov 29 06:19:38 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:38.149+0000 7f8d11e36140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 29 06:19:38 compute-1 romantic_carver[81575]: {
Nov 29 06:19:38 compute-1 romantic_carver[81575]:     "0": [
Nov 29 06:19:38 compute-1 romantic_carver[81575]:         {
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             "devices": [
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "/dev/loop3"
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             ],
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             "lv_name": "ceph_lv0",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             "lv_size": "7511998464",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=336ec58c-893b-528f-a0c1-6ed1196bc047,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=f793b967-de22-4105-bb0d-c91464bf150f,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             "lv_uuid": "caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             "name": "ceph_lv0",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             "path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             "tags": {
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.block_uuid": "caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.cephx_lockbox_secret": "",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.cluster_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.cluster_name": "ceph",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.crush_device_class": "",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.encrypted": "0",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.osd_fsid": "f793b967-de22-4105-bb0d-c91464bf150f",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.osd_id": "0",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.osdspec_affinity": "default_drive_group",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.type": "block",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:                 "ceph.vdo": "0"
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             },
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             "type": "block",
Nov 29 06:19:38 compute-1 romantic_carver[81575]:             "vg_name": "ceph_vg0"
Nov 29 06:19:38 compute-1 romantic_carver[81575]:         }
Nov 29 06:19:38 compute-1 romantic_carver[81575]:     ]
Nov 29 06:19:38 compute-1 romantic_carver[81575]: }
Nov 29 06:19:38 compute-1 systemd[1]: libpod-8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5.scope: Deactivated successfully.
Nov 29 06:19:38 compute-1 podman[81559]: 2025-11-29 06:19:38.456388583 +0000 UTC m=+0.891281858 container died 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:19:38 compute-1 ceph-mon[80754]: from='client.14268 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:19:38 compute-1 ceph-mon[80754]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Nov 29 06:19:38 compute-1 systemd[1]: var-lib-containers-storage-overlay-bb588a27fcacf5a0d04b5f02fb51c7b0b9ba2f101c539581f9956da9fc72d395-merged.mount: Deactivated successfully.
Nov 29 06:19:38 compute-1 podman[81559]: 2025-11-29 06:19:38.523427088 +0000 UTC m=+0.958320343 container remove 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:38 compute-1 systemd[1]: libpod-conmon-8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5.scope: Deactivated successfully.
Nov 29 06:19:38 compute-1 sudo[81452]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:38 compute-1 sudo[81596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:38 compute-1 sudo[81596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:38 compute-1 sudo[81596]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:38 compute-1 sudo[81621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:38 compute-1 sudo[81621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:38 compute-1 sudo[81621]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:38 compute-1 sudo[81646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:38 compute-1 sudo[81646]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:38 compute-1 sudo[81646]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:38 compute-1 sudo[81671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- raw list --format json
Nov 29 06:19:38 compute-1 sudo[81671]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:39 compute-1 podman[81735]: 2025-11-29 06:19:39.19956206 +0000 UTC m=+0.021736876 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:39 compute-1 podman[81735]: 2025-11-29 06:19:39.311109903 +0000 UTC m=+0.133284699 container create ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Nov 29 06:19:39 compute-1 systemd[1]: Started libpod-conmon-ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b.scope.
Nov 29 06:19:39 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:19:39 compute-1 podman[81735]: 2025-11-29 06:19:39.414995874 +0000 UTC m=+0.237170700 container init ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Nov 29 06:19:39 compute-1 podman[81735]: 2025-11-29 06:19:39.425886756 +0000 UTC m=+0.248061542 container start ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:19:39 compute-1 podman[81735]: 2025-11-29 06:19:39.429066465 +0000 UTC m=+0.251241351 container attach ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:19:39 compute-1 serene_diffie[81752]: 167 167
Nov 29 06:19:39 compute-1 systemd[1]: libpod-ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b.scope: Deactivated successfully.
Nov 29 06:19:39 compute-1 podman[81735]: 2025-11-29 06:19:39.433392156 +0000 UTC m=+0.255567012 container died ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-b783fb19fe463c7bd64a6b3c05d5b305fecd2a1637d4528f1aad4d4c1d01992f-merged.mount: Deactivated successfully.
Nov 29 06:19:39 compute-1 podman[81735]: 2025-11-29 06:19:39.493220949 +0000 UTC m=+0.315395745 container remove ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:39 compute-1 systemd[1]: libpod-conmon-ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b.scope: Deactivated successfully.
Nov 29 06:19:39 compute-1 podman[81775]: 2025-11-29 06:19:39.708594262 +0000 UTC m=+0.055861816 container create fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:39 compute-1 systemd[1]: Started libpod-conmon-fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb.scope.
Nov 29 06:19:39 compute-1 podman[81775]: 2025-11-29 06:19:39.678437263 +0000 UTC m=+0.025704877 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:19:39 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:19:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3c2e988dc22fe37205ec4a2e43ff5e9e1f139f8b01c91f53da8011e3eb7e282/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3c2e988dc22fe37205ec4a2e43ff5e9e1f139f8b01c91f53da8011e3eb7e282/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3c2e988dc22fe37205ec4a2e43ff5e9e1f139f8b01c91f53da8011e3eb7e282/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3c2e988dc22fe37205ec4a2e43ff5e9e1f139f8b01c91f53da8011e3eb7e282/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:19:39 compute-1 podman[81775]: 2025-11-29 06:19:39.808394939 +0000 UTC m=+0.155662543 container init fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:39 compute-1 ceph-mon[80754]: pgmap v108: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:39 compute-1 podman[81775]: 2025-11-29 06:19:39.822632965 +0000 UTC m=+0.169900479 container start fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:19:39 compute-1 podman[81775]: 2025-11-29 06:19:39.838377843 +0000 UTC m=+0.185645437 container attach fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:19:39 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'localpool'
Nov 29 06:19:39 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Nov 29 06:19:40 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Nov 29 06:19:40 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'mds_autoscaler'
Nov 29 06:19:40 compute-1 ceph-mon[80754]: 2.9 scrub starts
Nov 29 06:19:40 compute-1 ceph-mon[80754]: 2.9 scrub ok
Nov 29 06:19:40 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/713391435' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Nov 29 06:19:40 compute-1 ceph-mon[80754]: pgmap v109: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:40 compute-1 ceph-mon[80754]: 2.1d scrub starts
Nov 29 06:19:40 compute-1 ceph-mon[80754]: 2.1d scrub ok
Nov 29 06:19:40 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/713391435' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Nov 29 06:19:40 compute-1 zen_thompson[81792]: {
Nov 29 06:19:40 compute-1 zen_thompson[81792]:     "f793b967-de22-4105-bb0d-c91464bf150f": {
Nov 29 06:19:40 compute-1 zen_thompson[81792]:         "ceph_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 06:19:40 compute-1 zen_thompson[81792]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Nov 29 06:19:40 compute-1 zen_thompson[81792]:         "osd_id": 0,
Nov 29 06:19:40 compute-1 zen_thompson[81792]:         "osd_uuid": "f793b967-de22-4105-bb0d-c91464bf150f",
Nov 29 06:19:40 compute-1 zen_thompson[81792]:         "type": "bluestore"
Nov 29 06:19:40 compute-1 zen_thompson[81792]:     }
Nov 29 06:19:40 compute-1 zen_thompson[81792]: }
Nov 29 06:19:41 compute-1 sshd-session[81802]: Invalid user ubuntu from 45.55.249.98 port 57548
Nov 29 06:19:41 compute-1 systemd[1]: libpod-fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb.scope: Deactivated successfully.
Nov 29 06:19:41 compute-1 podman[81775]: 2025-11-29 06:19:41.066536492 +0000 UTC m=+1.413804026 container died fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:19:41 compute-1 systemd[1]: libpod-fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb.scope: Consumed 1.227s CPU time.
Nov 29 06:19:41 compute-1 sshd-session[81802]: Received disconnect from 45.55.249.98 port 57548:11: Bye Bye [preauth]
Nov 29 06:19:41 compute-1 sshd-session[81802]: Disconnected from invalid user ubuntu 45.55.249.98 port 57548 [preauth]
Nov 29 06:19:41 compute-1 systemd[1]: var-lib-containers-storage-overlay-a3c2e988dc22fe37205ec4a2e43ff5e9e1f139f8b01c91f53da8011e3eb7e282-merged.mount: Deactivated successfully.
Nov 29 06:19:41 compute-1 podman[81775]: 2025-11-29 06:19:41.132675492 +0000 UTC m=+1.479943016 container remove fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:19:41 compute-1 systemd[1]: libpod-conmon-fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb.scope: Deactivated successfully.
Nov 29 06:19:41 compute-1 sudo[81671]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:41 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'mirroring'
Nov 29 06:19:41 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'nfs'
Nov 29 06:19:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054711 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:19:42 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:42 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:42 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:42 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:42 compute-1 ceph-mgr[81116]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 29 06:19:42 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'orchestrator'
Nov 29 06:19:42 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:42.252+0000 7f8d11e36140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 29 06:19:42 compute-1 ceph-mgr[81116]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 29 06:19:42 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'osd_perf_query'
Nov 29 06:19:42 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:42.964+0000 7f8d11e36140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 29 06:19:43 compute-1 ceph-mon[80754]: pgmap v110: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:43 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1241390295' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Nov 29 06:19:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Nov 29 06:19:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:19:43 compute-1 ceph-mon[80754]: Deploying daemon osd.2 on compute-2
Nov 29 06:19:43 compute-1 ceph-mon[80754]: 2.1b scrub starts
Nov 29 06:19:43 compute-1 ceph-mgr[81116]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 29 06:19:43 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'osd_support'
Nov 29 06:19:43 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:43.272+0000 7f8d11e36140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 29 06:19:43 compute-1 ceph-mgr[81116]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 29 06:19:43 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:43.564+0000 7f8d11e36140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 29 06:19:43 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'pg_autoscaler'
Nov 29 06:19:43 compute-1 ceph-mgr[81116]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 29 06:19:43 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'progress'
Nov 29 06:19:43 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:43.861+0000 7f8d11e36140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 29 06:19:44 compute-1 ceph-mgr[81116]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 29 06:19:44 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'prometheus'
Nov 29 06:19:44 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:44.108+0000 7f8d11e36140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 29 06:19:44 compute-1 ceph-mon[80754]: 2.1b scrub ok
Nov 29 06:19:44 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/264614796' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 29 06:19:45 compute-1 ceph-mgr[81116]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 29 06:19:45 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'rbd_support'
Nov 29 06:19:45 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:45.140+0000 7f8d11e36140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 29 06:19:45 compute-1 ceph-mon[80754]: pgmap v111: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:45 compute-1 ceph-mgr[81116]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 29 06:19:45 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'restful'
Nov 29 06:19:45 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:45.438+0000 7f8d11e36140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 29 06:19:45 compute-1 sshd-session[81826]: Invalid user testuser from 118.194.230.250 port 50152
Nov 29 06:19:46 compute-1 sshd-session[81826]: Received disconnect from 118.194.230.250 port 50152:11: Bye Bye [preauth]
Nov 29 06:19:46 compute-1 sshd-session[81826]: Disconnected from invalid user testuser 118.194.230.250 port 50152 [preauth]
Nov 29 06:19:46 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'rgw'
Nov 29 06:19:46 compute-1 ceph-mgr[81116]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 29 06:19:46 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'rook'
Nov 29 06:19:46 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:46.921+0000 7f8d11e36140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 29 06:19:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:19:47 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2969688060' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Nov 29 06:19:48 compute-1 ceph-mon[80754]: pgmap v112: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:48 compute-1 ceph-mon[80754]: 2.19 scrub starts
Nov 29 06:19:48 compute-1 ceph-mon[80754]: 2.19 scrub ok
Nov 29 06:19:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:49 compute-1 ceph-mgr[81116]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 29 06:19:49 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'selftest'
Nov 29 06:19:49 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:49.119+0000 7f8d11e36140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 29 06:19:49 compute-1 ceph-mgr[81116]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 29 06:19:49 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'snap_schedule'
Nov 29 06:19:49 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:49.360+0000 7f8d11e36140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 29 06:19:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e32 e32: 3 total, 2 up, 3 in
Nov 29 06:19:49 compute-1 ceph-mon[80754]: pgmap v113: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:49 compute-1 ceph-mon[80754]: from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 29 06:19:49 compute-1 ceph-mon[80754]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 29 06:19:49 compute-1 ceph-mon[80754]: from='client.14301 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 29 06:19:49 compute-1 ceph-mgr[81116]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 29 06:19:49 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:49.609+0000 7f8d11e36140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 29 06:19:49 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'stats'
Nov 29 06:19:49 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'status'
Nov 29 06:19:50 compute-1 ceph-mgr[81116]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 29 06:19:50 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'telegraf'
Nov 29 06:19:50 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:50.134+0000 7f8d11e36140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 29 06:19:50 compute-1 sudo[81828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:50 compute-1 sudo[81828]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:50 compute-1 sudo[81828]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:50 compute-1 sudo[81853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:19:50 compute-1 sudo[81853]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:50 compute-1 sudo[81853]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:50 compute-1 ceph-mgr[81116]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 29 06:19:50 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'telemetry'
Nov 29 06:19:50 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:50.378+0000 7f8d11e36140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 29 06:19:50 compute-1 sudo[81878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:50 compute-1 sudo[81878]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:50 compute-1 sudo[81878]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:50 compute-1 sudo[81903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:50 compute-1 sudo[81903]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:50 compute-1 sudo[81903]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:50 compute-1 ceph-mon[80754]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Nov 29 06:19:50 compute-1 ceph-mon[80754]: osdmap e32: 3 total, 2 up, 3 in
Nov 29 06:19:50 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:50 compute-1 ceph-mon[80754]: from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 29 06:19:50 compute-1 ceph-mon[80754]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 29 06:19:50 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:50 compute-1 ceph-mon[80754]: pgmap v115: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:50 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:50 compute-1 ceph-mon[80754]: Standby manager daemon compute-2.ngsyhe started
Nov 29 06:19:50 compute-1 sudo[81928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:50 compute-1 sudo[81928]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:50 compute-1 sudo[81928]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:50 compute-1 sudo[81953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:19:50 compute-1 sudo[81953]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:51 compute-1 ceph-mgr[81116]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 29 06:19:51 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'test_orchestrator'
Nov 29 06:19:51 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:51.017+0000 7f8d11e36140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 29 06:19:51 compute-1 podman[82051]: 2025-11-29 06:19:51.308914885 +0000 UTC m=+0.119982269 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Nov 29 06:19:51 compute-1 podman[82051]: 2025-11-29 06:19:51.509043133 +0000 UTC m=+0.320110537 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:19:51 compute-1 ceph-mgr[81116]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 29 06:19:51 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'volumes'
Nov 29 06:19:51 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:51.713+0000 7f8d11e36140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 29 06:19:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e33 e33: 3 total, 2 up, 3 in
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.1d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786844254s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623573303s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.1d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786844254s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623573303s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.1c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786513329s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623565674s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.1c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786513329s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623565674s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786314964s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623596191s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786314964s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623596191s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.5( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.785975456s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623611450s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.5( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.785975456s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623611450s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.785957336s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623748779s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.785957336s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623748779s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.12( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786064148s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623954773s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.12( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786064148s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623954773s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.18( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786125183s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.624198914s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:19:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.18( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786125183s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.624198914s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:19:52 compute-1 sudo[81953]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:52 compute-1 ceph-mon[80754]: from='client.14307 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 29 06:19:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:19:52 compute-1 sudo[82140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:52 compute-1 sudo[82140]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:52 compute-1 sudo[82140]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:52 compute-1 sudo[82165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:19:52 compute-1 sudo[82165]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:52 compute-1 sudo[82165]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:52 compute-1 ceph-mgr[81116]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 29 06:19:52 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:52.433+0000 7f8d11e36140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 29 06:19:52 compute-1 ceph-mgr[81116]: mgr[py] Loading python module 'zabbix'
Nov 29 06:19:52 compute-1 sudo[82190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:19:52 compute-1 sudo[82190]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:52 compute-1 sudo[82190]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:52 compute-1 sudo[82215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:19:52 compute-1 sudo[82215]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:19:52 compute-1 ceph-mgr[81116]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 29 06:19:52 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:52.680+0000 7f8d11e36140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 29 06:19:52 compute-1 ceph-mgr[81116]: ms_deliver_dispatch: unhandled message 0x5649d912b1e0 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Nov 29 06:19:52 compute-1 ceph-mgr[81116]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 06:19:53 compute-1 sudo[82215]: pam_unix(sudo:session): session closed for user root
Nov 29 06:19:53 compute-1 ceph-mon[80754]: purged_snaps scrub starts
Nov 29 06:19:53 compute-1 ceph-mon[80754]: purged_snaps scrub ok
Nov 29 06:19:53 compute-1 ceph-mon[80754]: 2.15 scrub starts
Nov 29 06:19:53 compute-1 ceph-mon[80754]: 2.15 scrub ok
Nov 29 06:19:53 compute-1 ceph-mon[80754]: pgmap v116: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:53 compute-1 ceph-mon[80754]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Nov 29 06:19:53 compute-1 ceph-mon[80754]: osdmap e33: 3 total, 2 up, 3 in
Nov 29 06:19:53 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:53 compute-1 ceph-mon[80754]: mgrmap e9: compute-0.vxabpq(active, since 2m), standbys: compute-2.ngsyhe
Nov 29 06:19:53 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:53 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mgr metadata", "who": "compute-2.ngsyhe", "id": "compute-2.ngsyhe"}]: dispatch
Nov 29 06:19:53 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:53 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:53 compute-1 ceph-mon[80754]: from='client.14313 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 29 06:19:53 compute-1 ceph-mon[80754]: Standby manager daemon compute-1.gaxpay started
Nov 29 06:19:53 compute-1 ceph-mgr[81116]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 06:19:54 compute-1 ceph-mgr[81116]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 06:19:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e34 e34: 3 total, 2 up, 3 in
Nov 29 06:19:56 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:56 compute-1 ceph-mon[80754]: pgmap v118: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:56 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:56 compute-1 ceph-mon[80754]: from='client.14319 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 29 06:19:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e34 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:19:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:19:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:58 compute-1 ceph-mon[80754]: mgrmap e10: compute-0.vxabpq(active, since 3m), standbys: compute-2.ngsyhe, compute-1.gaxpay
Nov 29 06:19:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mgr metadata", "who": "compute-1.gaxpay", "id": "compute-1.gaxpay"}]: dispatch
Nov 29 06:19:58 compute-1 ceph-mon[80754]: pgmap v119: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:19:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:58 compute-1 ceph-mon[80754]: osdmap e34: 3 total, 2 up, 3 in
Nov 29 06:19:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:19:58 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/4274267034' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Nov 29 06:19:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:19:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:19:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e35 e35: 3 total, 2 up, 3 in
Nov 29 06:19:59 compute-1 ceph-mon[80754]: pgmap v121: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:19:59 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:19:59 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:01 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e36 e36: 3 total, 2 up, 3 in
Nov 29 06:20:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:20:03 compute-1 ceph-mon[80754]: osdmap e35: 3 total, 2 up, 3 in
Nov 29 06:20:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:20:03 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2162770432' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Nov 29 06:20:03 compute-1 ceph-mon[80754]: pgmap v123: 38 pgs: 38 active+clean; 449 KiB data, 53 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:03 compute-1 ceph-mon[80754]: Health detail: HEALTH_ERR 1 filesystem is offline; 1 filesystem is online with fewer MDS than max_mds
Nov 29 06:20:03 compute-1 ceph-mon[80754]: [ERR] MDS_ALL_DOWN: 1 filesystem is offline
Nov 29 06:20:03 compute-1 ceph-mon[80754]:     fs cephfs is offline because no MDS is active for it.
Nov 29 06:20:03 compute-1 ceph-mon[80754]: [WRN] MDS_UP_LESS_THAN_MAX: 1 filesystem is online with fewer MDS than max_mds
Nov 29 06:20:03 compute-1 ceph-mon[80754]:     fs cephfs has 0 MDS online, but wants 1
Nov 29 06:20:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e37 e37: 3 total, 2 up, 3 in
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:05 compute-1 ceph-mon[80754]: osdmap e36: 3 total, 2 up, 3 in
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Nov 29 06:20:05 compute-1 ceph-mon[80754]: pgmap v125: 100 pgs: 62 unknown, 38 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3618548784' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:05 compute-1 ceph-mon[80754]: osdmap e37: 3 total, 2 up, 3 in
Nov 29 06:20:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:05 compute-1 sudo[82271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:05 compute-1 sudo[82271]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:05 compute-1 sudo[82271]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:05 compute-1 sudo[82296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 29 06:20:05 compute-1 sudo[82296]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:05 compute-1 sudo[82296]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:05 compute-1 sudo[82321]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:05 compute-1 sudo[82321]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:05 compute-1 sudo[82321]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:05 compute-1 sudo[82348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph
Nov 29 06:20:05 compute-1 sudo[82348]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:05 compute-1 sudo[82348]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:05 compute-1 sudo[82373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:05 compute-1 sudo[82373]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:05 compute-1 sudo[82373]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:20:06 compute-1 sudo[82398]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82398]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-1 sudo[82423]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82423]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:20:06 compute-1 sudo[82448]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82448]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sshd-session[82345]: Invalid user ubadmin from 71.70.164.48 port 50600
Nov 29 06:20:06 compute-1 sudo[82473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-1 sshd-session[82345]: Received disconnect from 71.70.164.48 port 50600:11: Bye Bye [preauth]
Nov 29 06:20:06 compute-1 sshd-session[82345]: Disconnected from invalid user ubadmin 71.70.164.48 port 50600 [preauth]
Nov 29 06:20:06 compute-1 sudo[82473]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82473]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:20:06 compute-1 sudo[82498]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82498]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-1 sudo[82546]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82546]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:20:06 compute-1 sudo[82571]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82571]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-1 sudo[82596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82596]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new
Nov 29 06:20:06 compute-1 sudo[82621]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82621]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-1 sudo[82646]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82646]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 29 06:20:06 compute-1 sudo[82671]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82671]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:06 compute-1 sudo[82696]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:06 compute-1 sudo[82696]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:06 compute-1 sudo[82721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:20:07 compute-1 sudo[82721]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-1 sudo[82721]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-1 sudo[82746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:07 compute-1 sudo[82746]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-1 sudo[82746]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-1 sudo[82771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config
Nov 29 06:20:07 compute-1 sudo[82771]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-1 sudo[82771]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-1 sudo[82796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:07 compute-1 sudo[82796]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-1 sudo[82796]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-1 sudo[82821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:20:07 compute-1 sudo[82821]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-1 sudo[82821]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-1 sudo[82846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:07 compute-1 sudo[82846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-1 sudo[82846]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-1 sudo[82871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:20:07 compute-1 sudo[82871]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-1 sudo[82871]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-1 sudo[82896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:07 compute-1 sudo[82896]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-1 sudo[82896]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-1 sudo[82921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:20:07 compute-1 sudo[82921]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-1 sudo[82921]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-1 sudo[82969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:07 compute-1 sudo[82969]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-1 sudo[82969]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:07 compute-1 sudo[82994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:20:07 compute-1 sudo[82994]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:07 compute-1 sudo[82994]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:08 compute-1 sudo[83019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:08 compute-1 sudo[83019]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:08 compute-1 sudo[83019]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:08 compute-1 sudo[83044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new
Nov 29 06:20:08 compute-1 sudo[83044]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:08 compute-1 sudo[83044]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:08 compute-1 sudo[83069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:08 compute-1 sudo[83069]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:08 compute-1 sudo[83069]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:08 compute-1 sudo[83094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-336ec58c-893b-528f-a0c1-6ed1196bc047/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf.new /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:20:08 compute-1 sudo[83094]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:08 compute-1 sudo[83094]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:12 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) log_latency slow operation observed for kv_commit, latency = 7.038947105s
Nov 29 06:20:12 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) log_latency slow operation observed for kv_sync, latency = 7.038947105s
Nov 29 06:20:12 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) log_latency_fn slow operation observed for _txc_committed_kv, latency = 7.039439678s, txc = 0x5566c8a4b800
Nov 29 06:20:13 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e38 e38: 3 total, 2 up, 3 in
Nov 29 06:20:13 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) log_latency_fn slow operation observed for _txc_committed_kv, latency = 7.175556183s, txc = 0x5566c8a53b00
Nov 29 06:20:13 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).paxos(paxos updating c 1..407) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 2.359349251s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Nov 29 06:20:13 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-1[80750]: 2025-11-29T06:20:13.421+0000 7f7b25776640 -1 mon.compute-1@2(peon).paxos(paxos updating c 1..407) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 2.359349251s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Nov 29 06:20:13 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:13 compute-1 ceph-mon[80754]: pgmap v126: 100 pgs: 62 unknown, 38 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:20:13 compute-1 ceph-mon[80754]: 4.1 scrub starts
Nov 29 06:20:13 compute-1 ceph-mon[80754]: 4.1 scrub ok
Nov 29 06:20:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:13 compute-1 ceph-mon[80754]: 4.2 scrub starts
Nov 29 06:20:13 compute-1 ceph-mon[80754]: 4.2 scrub ok
Nov 29 06:20:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Nov 29 06:20:13 compute-1 ceph-mon[80754]: Adjusting osd_memory_target on compute-2 to 128.0M
Nov 29 06:20:13 compute-1 ceph-mon[80754]: Unable to set osd_memory_target on compute-2 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Nov 29 06:20:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:20:13 compute-1 ceph-mon[80754]: Updating compute-0:/etc/ceph/ceph.conf
Nov 29 06:20:13 compute-1 ceph-mon[80754]: Updating compute-1:/etc/ceph/ceph.conf
Nov 29 06:20:13 compute-1 ceph-mon[80754]: Updating compute-2:/etc/ceph/ceph.conf
Nov 29 06:20:13 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3247558833' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Nov 29 06:20:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e39 e39: 3 total, 2 up, 3 in
Nov 29 06:20:16 compute-1 ceph-mon[80754]: pgmap v128: 131 pgs: 62 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:20:16 compute-1 ceph-mon[80754]: osdmap e38: 3 total, 2 up, 3 in
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: 4.3 scrub starts
Nov 29 06:20:16 compute-1 ceph-mon[80754]: 4.3 scrub ok
Nov 29 06:20:16 compute-1 ceph-mon[80754]: Updating compute-0:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:20:16 compute-1 ceph-mon[80754]: Updating compute-1:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:20:16 compute-1 ceph-mon[80754]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: pgmap v130: 131 pgs: 62 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: OSD bench result of 1381.921175 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 29 06:20:16 compute-1 ceph-mon[80754]: pgmap v131: 131 pgs: 62 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: 4.4 scrub starts
Nov 29 06:20:16 compute-1 ceph-mon[80754]: 4.4 scrub ok
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: pgmap v132: 131 pgs: 62 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: 4.5 scrub starts
Nov 29 06:20:16 compute-1 ceph-mon[80754]: 4.5 scrub ok
Nov 29 06:20:16 compute-1 ceph-mon[80754]: 4.6 scrub starts
Nov 29 06:20:16 compute-1 ceph-mon[80754]: 4.6 scrub ok
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 06:20:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e40 e40: 3 total, 3 up, 3 in
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.1d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.1d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[7.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=40 pruub=15.883583069s) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active pruub 117.207687378s@ mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.1c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.5( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.1c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.5( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.18( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.18( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.12( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.12( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:17 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[7.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=40 pruub=15.883583069s) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown pruub 117.207687378s@ mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:19 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:19 compute-1 sshd-session[83119]: Invalid user username from 119.45.242.7 port 37396
Nov 29 06:20:19 compute-1 sshd-session[83119]: Received disconnect from 119.45.242.7 port 37396:11: Bye Bye [preauth]
Nov 29 06:20:19 compute-1 sshd-session[83119]: Disconnected from invalid user username 119.45.242.7 port 37396 [preauth]
Nov 29 06:20:27 compute-1 ceph-mon[80754]: pgmap v133: 131 pgs: 62 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:27 compute-1 ceph-mon[80754]: osdmap e39: 3 total, 2 up, 3 in
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:27 compute-1 ceph-mon[80754]: pgmap v135: 146 pgs: 77 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:27 compute-1 ceph-mon[80754]: osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518] boot
Nov 29 06:20:27 compute-1 ceph-mon[80754]: osdmap e40: 3 total, 3 up, 3 in
Nov 29 06:20:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Nov 29 06:20:27 compute-1 sshd-session[71378]: Received disconnect from 38.102.83.107 port 46652:11: disconnected by user
Nov 29 06:20:27 compute-1 sshd-session[71378]: Disconnected from user zuul 38.102.83.107 port 46652
Nov 29 06:20:27 compute-1 sshd-session[71375]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:20:28 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Nov 29 06:20:28 compute-1 systemd[1]: session-19.scope: Consumed 9.794s CPU time.
Nov 29 06:20:28 compute-1 systemd-logind[785]: Session 19 logged out. Waiting for processes to exit.
Nov 29 06:20:28 compute-1 systemd-logind[785]: Removed session 19.
Nov 29 06:20:28 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).paxos(paxos updating c 1..414) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 4.066800594s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Nov 29 06:20:28 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-1[80750]: 2025-11-29T06:20:28.616+0000 7f7b25776640 -1 mon.compute-1@2(peon).paxos(paxos updating c 1..414) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 4.066800594s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Nov 29 06:20:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1e( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1c( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1a( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.16( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.15( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.10( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.c( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.b( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.8( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.19( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.e( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.d( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.2( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.7( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.5( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.a( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.14( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.11( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.17( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.12( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1d( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:29 compute-1 ceph-mon[80754]: pgmap v137: 177 pgs: 108 unknown, 69 active+clean; 449 KiB data, 54 MiB used, 14 GiB / 14 GiB avail
Nov 29 06:20:29 compute-1 ceph-mon[80754]: 4.7 scrub starts
Nov 29 06:20:29 compute-1 ceph-mon[80754]: 4.7 scrub ok
Nov 29 06:20:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.16( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.15( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.10( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.c( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1a( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.8( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1c( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.19( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.0( empty local-lis/les=40/41 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.2( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.7( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.5( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.a( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.14( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.d( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.11( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.17( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.12( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1d( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:31 compute-1 ceph-mon[80754]: pgmap v138: 177 pgs: 7 peering, 108 unknown, 62 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-1 ceph-mon[80754]: pgmap v139: 177 pgs: 24 peering, 93 unknown, 60 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-1 ceph-mon[80754]: pgmap v140: 177 pgs: 95 peering, 31 unknown, 51 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-1 ceph-mon[80754]: pgmap v141: 177 pgs: 95 peering, 31 unknown, 51 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-1 ceph-mon[80754]: 4.8 deep-scrub starts
Nov 29 06:20:31 compute-1 ceph-mon[80754]: 4.8 deep-scrub ok
Nov 29 06:20:31 compute-1 ceph-mon[80754]: pgmap v142: 177 pgs: 95 peering, 31 unknown, 51 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:20:31 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:31 compute-1 ceph-mon[80754]: osdmap e41: 3 total, 3 up, 3 in
Nov 29 06:20:31 compute-1 ceph-mon[80754]: pgmap v144: 177 pgs: 95 peering, 31 unknown, 51 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:31 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:31 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:20:31 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:20:31 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:32 compute-1 ceph-mon[80754]: 4.9 deep-scrub starts
Nov 29 06:20:32 compute-1 ceph-mon[80754]: 4.9 deep-scrub ok
Nov 29 06:20:32 compute-1 ceph-mon[80754]: 5.c deep-scrub starts
Nov 29 06:20:32 compute-1 ceph-mon[80754]: 5.c deep-scrub ok
Nov 29 06:20:34 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Nov 29 06:20:34 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Nov 29 06:20:34 compute-1 ceph-mon[80754]: pgmap v145: 177 pgs: 78 peering, 99 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:34 compute-1 ceph-mon[80754]: 4.a deep-scrub starts
Nov 29 06:20:34 compute-1 ceph-mon[80754]: 4.a deep-scrub ok
Nov 29 06:20:34 compute-1 ceph-mon[80754]: 5.2 scrub starts
Nov 29 06:20:34 compute-1 ceph-mon[80754]: 5.2 scrub ok
Nov 29 06:20:34 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Nov 29 06:20:35 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.18( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.13( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.c( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.e( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.d( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.a( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.8( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.a( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.5( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.2( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.1b( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.1a( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.d( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.e( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.3( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.5( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.7( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.709385872s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.802291870s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.709323883s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.802291870s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1d( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.709070206s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.802078247s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.709043503s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.802078247s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.11( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708967209s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.802047729s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1d( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.709013939s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.802078247s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708970070s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.802078247s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.11( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708914757s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.802047729s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.14( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708758354s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801986694s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.a( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708691597s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801956177s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.14( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708703041s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801986694s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.a( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708639145s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801956177s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708537102s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801925659s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.5( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708424568s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801849365s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708496094s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801925659s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.5( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708378792s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801849365s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.2( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708154678s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active+scrubbing pruub 129.801681519s@ [ 7.2:  ]  mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708103180s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801712036s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.2( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708094597s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801681519s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707938194s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801605225s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707839966s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801528931s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708559990s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.802261353s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707894325s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801605225s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707805634s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801528931s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708518028s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.802261353s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708050728s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801712036s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707393646s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801391602s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707491875s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801498413s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.8( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707393646s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801437378s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.10( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707167625s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801223755s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707448006s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801498413s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707311630s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801391602s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707350731s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801452637s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.10( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707128525s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801223755s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.8( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707340240s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801437378s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707299232s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801452637s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.16( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.431221962s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.525558472s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.430930138s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.525344849s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.430959702s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.525375366s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.16( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.431178093s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.525558472s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.430882454s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.525344849s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.430913925s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.525375366s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.16( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.10( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.1c( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.14( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.1c( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.1b( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.13( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.1f( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.5( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.d( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.10( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:36 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Nov 29 06:20:36 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Nov 29 06:20:36 compute-1 ceph-mon[80754]: pgmap v146: 177 pgs: 177 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 06:20:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:20:36 compute-1 ceph-mon[80754]: 7.1 scrub starts
Nov 29 06:20:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:20:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:20:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 06:20:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:20:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:20:36 compute-1 ceph-mon[80754]: osdmap e42: 3 total, 3 up, 3 in
Nov 29 06:20:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Nov 29 06:20:36 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.18( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:36 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.13( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.d( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.e( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.a( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.8( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.a( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.5( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.2( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.1b( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.d( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.1a( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.e( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.3( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.5( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.7( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.c( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.1f( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.16( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.11( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.15( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.13( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.10( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.10( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.16( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.7( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.3( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.2( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.a( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.f( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.1b( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.1c( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.5( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.1( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.c( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.d( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.9( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.18( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.14( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.1c( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.f( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:37 compute-1 ceph-mon[80754]: 7.1 scrub ok
Nov 29 06:20:37 compute-1 ceph-mon[80754]: 3.a deep-scrub starts
Nov 29 06:20:37 compute-1 ceph-mon[80754]: 3.a deep-scrub ok
Nov 29 06:20:37 compute-1 ceph-mon[80754]: 7.2 scrub starts
Nov 29 06:20:37 compute-1 ceph-mon[80754]: 4.b scrub starts
Nov 29 06:20:37 compute-1 ceph-mon[80754]: 4.b scrub ok
Nov 29 06:20:37 compute-1 ceph-mon[80754]: 5.4 scrub starts
Nov 29 06:20:37 compute-1 ceph-mon[80754]: 5.4 scrub ok
Nov 29 06:20:37 compute-1 ceph-mon[80754]: pgmap v148: 177 pgs: 14 peering, 163 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:37 compute-1 ceph-mon[80754]: 7.7 scrub starts
Nov 29 06:20:37 compute-1 ceph-mon[80754]: 7.7 scrub ok
Nov 29 06:20:37 compute-1 ceph-mon[80754]: osdmap e43: 3 total, 3 up, 3 in
Nov 29 06:20:38 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Nov 29 06:20:38 compute-1 ceph-mon[80754]: 4.f scrub starts
Nov 29 06:20:38 compute-1 ceph-mon[80754]: 4.f scrub ok
Nov 29 06:20:38 compute-1 ceph-mon[80754]: osdmap e44: 3 total, 3 up, 3 in
Nov 29 06:20:39 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.c scrub starts
Nov 29 06:20:39 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.c scrub ok
Nov 29 06:20:39 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:41 compute-1 ceph-mon[80754]: pgmap v150: 177 pgs: 55 peering, 122 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:41 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:41 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:41 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.pkypgd", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 06:20:41 compute-1 ceph-mon[80754]: 7.c scrub starts
Nov 29 06:20:41 compute-1 ceph-mon[80754]: 7.c scrub ok
Nov 29 06:20:42 compute-1 ceph-mon[80754]: 3.9 scrub starts
Nov 29 06:20:42 compute-1 ceph-mon[80754]: 3.9 scrub ok
Nov 29 06:20:42 compute-1 ceph-mon[80754]: pgmap v152: 177 pgs: 75 peering, 102 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:42 compute-1 ceph-mon[80754]: 4.10 scrub starts
Nov 29 06:20:42 compute-1 ceph-mon[80754]: 4.10 scrub ok
Nov 29 06:20:42 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.pkypgd", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 06:20:42 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:42 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:42 compute-1 ceph-mon[80754]: Deploying daemon rgw.rgw.compute-2.pkypgd on compute-2
Nov 29 06:20:43 compute-1 ceph-mon[80754]: pgmap v153: 177 pgs: 20 peering, 157 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:43 compute-1 ceph-mon[80754]: 4.11 deep-scrub starts
Nov 29 06:20:43 compute-1 ceph-mon[80754]: 4.11 deep-scrub ok
Nov 29 06:20:44 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:45 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.d scrub starts
Nov 29 06:20:45 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.d scrub ok
Nov 29 06:20:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Nov 29 06:20:46 compute-1 ceph-mon[80754]: pgmap v154: 177 pgs: 177 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:47 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.12 scrub starts
Nov 29 06:20:47 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.12 scrub ok
Nov 29 06:20:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Nov 29 06:20:47 compute-1 ceph-mon[80754]: 3.1a scrub starts
Nov 29 06:20:47 compute-1 ceph-mon[80754]: 3.1a scrub ok
Nov 29 06:20:47 compute-1 ceph-mon[80754]: 7.d scrub starts
Nov 29 06:20:47 compute-1 ceph-mon[80754]: 7.d scrub ok
Nov 29 06:20:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:47 compute-1 ceph-mon[80754]: pgmap v155: 177 pgs: 177 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:47 compute-1 ceph-mon[80754]: osdmap e45: 3 total, 3 up, 3 in
Nov 29 06:20:47 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 29 06:20:47 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 29 06:20:48 compute-1 sudo[83122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:48 compute-1 sudo[83122]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:48 compute-1 sudo[83122]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:48 compute-1 sudo[83147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:20:48 compute-1 sudo[83147]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:48 compute-1 sudo[83147]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:48 compute-1 sudo[83172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:20:48 compute-1 sudo[83172]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:48 compute-1 sudo[83172]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:48 compute-1 sudo[83197]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:20:48 compute-1 sudo[83197]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:20:48 compute-1 ceph-mon[80754]: 7.12 scrub starts
Nov 29 06:20:48 compute-1 ceph-mon[80754]: 7.12 scrub ok
Nov 29 06:20:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:48 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Nov 29 06:20:48 compute-1 ceph-mon[80754]: osdmap e46: 3 total, 3 up, 3 in
Nov 29 06:20:48 compute-1 ceph-mon[80754]: pgmap v158: 178 pgs: 1 unknown, 177 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.cbugbv", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 06:20:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.cbugbv", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 06:20:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:48 compute-1 ceph-mon[80754]: Deploying daemon rgw.rgw.compute-1.cbugbv on compute-1
Nov 29 06:20:48 compute-1 podman[83262]: 2025-11-29 06:20:48.904253151 +0000 UTC m=+0.057739438 container create 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 06:20:48 compute-1 systemd[1]: Started libpod-conmon-93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22.scope.
Nov 29 06:20:48 compute-1 podman[83262]: 2025-11-29 06:20:48.876140674 +0000 UTC m=+0.029627151 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:20:48 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:20:49 compute-1 podman[83262]: 2025-11-29 06:20:49.009237166 +0000 UTC m=+0.162723523 container init 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Nov 29 06:20:49 compute-1 podman[83262]: 2025-11-29 06:20:49.019962348 +0000 UTC m=+0.173448655 container start 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True)
Nov 29 06:20:49 compute-1 podman[83262]: 2025-11-29 06:20:49.023497825 +0000 UTC m=+0.176984202 container attach 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:20:49 compute-1 zealous_keller[83278]: 167 167
Nov 29 06:20:49 compute-1 systemd[1]: libpod-93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22.scope: Deactivated successfully.
Nov 29 06:20:49 compute-1 podman[83262]: 2025-11-29 06:20:49.02877882 +0000 UTC m=+0.182265117 container died 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:20:49 compute-1 systemd[1]: var-lib-containers-storage-overlay-09e2c3d39cb101d48c690e8edf1a2653ec2d1e147340ec5325528d76126c4f18-merged.mount: Deactivated successfully.
Nov 29 06:20:49 compute-1 podman[83262]: 2025-11-29 06:20:49.06618452 +0000 UTC m=+0.219670787 container remove 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 29 06:20:49 compute-1 systemd[1]: libpod-conmon-93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22.scope: Deactivated successfully.
Nov 29 06:20:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Nov 29 06:20:49 compute-1 systemd[1]: Reloading.
Nov 29 06:20:49 compute-1 systemd-sysv-generator[83327]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:20:49 compute-1 systemd-rc-local-generator[83323]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:20:49 compute-1 systemd[1]: Reloading.
Nov 29 06:20:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:49 compute-1 systemd-rc-local-generator[83366]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:20:49 compute-1 systemd-sysv-generator[83369]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:20:49 compute-1 systemd[1]: Starting Ceph rgw.rgw.compute-1.cbugbv for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:20:50 compute-1 podman[83423]: 2025-11-29 06:20:49.96794633 +0000 UTC m=+0.026980067 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:20:50 compute-1 podman[83423]: 2025-11-29 06:20:50.317621833 +0000 UTC m=+0.376655580 container create cfaa1f3ba4a2ce0fe5c305ba0458f16f2702133085ca83fb09a233adc862d6bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-1-cbugbv, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:20:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ce5436a4d026d72b60789a4d51f19c1971e0eb434b95c7f850d53f2c2460a58/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ce5436a4d026d72b60789a4d51f19c1971e0eb434b95c7f850d53f2c2460a58/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ce5436a4d026d72b60789a4d51f19c1971e0eb434b95c7f850d53f2c2460a58/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ce5436a4d026d72b60789a4d51f19c1971e0eb434b95c7f850d53f2c2460a58/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.cbugbv supports timestamps until 2038 (0x7fffffff)
Nov 29 06:20:50 compute-1 podman[83423]: 2025-11-29 06:20:50.49810303 +0000 UTC m=+0.557136757 container init cfaa1f3ba4a2ce0fe5c305ba0458f16f2702133085ca83fb09a233adc862d6bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-1-cbugbv, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True)
Nov 29 06:20:50 compute-1 podman[83423]: 2025-11-29 06:20:50.512275786 +0000 UTC m=+0.571309493 container start cfaa1f3ba4a2ce0fe5c305ba0458f16f2702133085ca83fb09a233adc862d6bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-1-cbugbv, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Nov 29 06:20:50 compute-1 ceph-mon[80754]: osdmap e47: 3 total, 3 up, 3 in
Nov 29 06:20:50 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 29 06:20:50 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 29 06:20:50 compute-1 bash[83423]: cfaa1f3ba4a2ce0fe5c305ba0458f16f2702133085ca83fb09a233adc862d6bf
Nov 29 06:20:50 compute-1 systemd[1]: Started Ceph rgw.rgw.compute-1.cbugbv for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:20:50 compute-1 radosgw[83442]: deferred set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:20:50 compute-1 radosgw[83442]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Nov 29 06:20:50 compute-1 radosgw[83442]: framework: beast
Nov 29 06:20:50 compute-1 radosgw[83442]: framework conf key: endpoint, val: 192.168.122.101:8082
Nov 29 06:20:50 compute-1 radosgw[83442]: init_numa not setting numa affinity
Nov 29 06:20:50 compute-1 sudo[83197]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Nov 29 06:20:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Nov 29 06:20:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Nov 29 06:20:51 compute-1 ceph-mon[80754]: pgmap v160: 179 pgs: 2 unknown, 177 active+clean; 449 KiB data, 80 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:20:51 compute-1 ceph-mon[80754]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:20:51 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Nov 29 06:20:51 compute-1 ceph-mon[80754]: osdmap e48: 3 total, 3 up, 3 in
Nov 29 06:20:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:51 compute-1 ceph-mon[80754]: 7.15 scrub starts
Nov 29 06:20:51 compute-1 ceph-mon[80754]: 7.15 scrub ok
Nov 29 06:20:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vmptkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 06:20:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vmptkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 06:20:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Nov 29 06:20:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Nov 29 06:20:52 compute-1 ceph-mon[80754]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1253186838' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 06:20:52 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.17 deep-scrub starts
Nov 29 06:20:52 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 49 pg[10.0( empty local-lis/les=0/0 n=0 ec=49/49 lis/c=0/0 les/c/f=0/0/0 sis=49) [0] r=0 lpr=49 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:20:52 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.17 deep-scrub ok
Nov 29 06:20:52 compute-1 ceph-mon[80754]: Deploying daemon rgw.rgw.compute-0.vmptkp on compute-0
Nov 29 06:20:52 compute-1 ceph-mon[80754]: 4.12 scrub starts
Nov 29 06:20:52 compute-1 ceph-mon[80754]: 4.12 scrub ok
Nov 29 06:20:52 compute-1 ceph-mon[80754]: pgmap v162: 179 pgs: 1 creating+peering, 178 active+clean; 450 KiB data, 80 MiB used, 21 GiB / 21 GiB avail; 705 B/s rd, 705 B/s wr, 1 op/s
Nov 29 06:20:52 compute-1 ceph-mon[80754]: osdmap e49: 3 total, 3 up, 3 in
Nov 29 06:20:52 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 06:20:52 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 06:20:52 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1253186838' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 06:20:52 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 06:20:52 compute-1 ceph-mon[80754]: 7.17 deep-scrub starts
Nov 29 06:20:52 compute-1 ceph-mon[80754]: 7.17 deep-scrub ok
Nov 29 06:20:53 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Nov 29 06:20:53 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 50 pg[10.0( empty local-lis/les=49/50 n=0 ec=49/49 lis/c=0/0 les/c/f=0/0/0 sis=49) [0] r=0 lpr=49 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:20:54 compute-1 ceph-mon[80754]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Nov 29 06:20:54 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Nov 29 06:20:54 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Nov 29 06:20:54 compute-1 ceph-mon[80754]: osdmap e50: 3 total, 3 up, 3 in
Nov 29 06:20:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:20:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Nov 29 06:20:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Nov 29 06:20:54 compute-1 ceph-mon[80754]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:56 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Nov 29 06:20:56 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Nov 29 06:20:56 compute-1 sshd-session[83513]: Invalid user 1 from 45.55.249.98 port 43110
Nov 29 06:20:56 compute-1 sshd-session[83513]: Received disconnect from 45.55.249.98 port 43110:11: Bye Bye [preauth]
Nov 29 06:20:56 compute-1 sshd-session[83513]: Disconnected from invalid user 1 45.55.249.98 port 43110 [preauth]
Nov 29 06:20:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Nov 29 06:20:57 compute-1 ceph-mon[80754]: 4.16 scrub starts
Nov 29 06:20:57 compute-1 ceph-mon[80754]: 4.16 scrub ok
Nov 29 06:20:57 compute-1 ceph-mon[80754]: pgmap v165: 180 pgs: 1 unknown, 1 creating+peering, 178 active+clean; 450 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 841 B/s rd, 841 B/s wr, 1 op/s
Nov 29 06:20:57 compute-1 ceph-mon[80754]: 4.17 scrub starts
Nov 29 06:20:57 compute-1 ceph-mon[80754]: 4.17 scrub ok
Nov 29 06:20:57 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:57 compute-1 ceph-mon[80754]: osdmap e51: 3 total, 3 up, 3 in
Nov 29 06:20:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:57 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Nov 29 06:20:57 compute-1 ceph-mon[80754]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:20:58 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1a deep-scrub starts
Nov 29 06:20:58 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1a deep-scrub ok
Nov 29 06:20:58 compute-1 ceph-mon[80754]: 5.e deep-scrub starts
Nov 29 06:20:58 compute-1 ceph-mon[80754]: pgmap v167: 181 pgs: 1 creating+peering, 1 unknown, 179 active+clean; 450 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 3.3 KiB/s rd, 402 B/s wr, 4 op/s
Nov 29 06:20:58 compute-1 ceph-mon[80754]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 06:20:58 compute-1 ceph-mon[80754]: 7.19 scrub starts
Nov 29 06:20:58 compute-1 ceph-mon[80754]: 7.19 scrub ok
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:58 compute-1 ceph-mon[80754]: 4.1e scrub starts
Nov 29 06:20:58 compute-1 ceph-mon[80754]: 4.1e scrub ok
Nov 29 06:20:58 compute-1 ceph-mon[80754]: 5.e deep-scrub ok
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 06:20:58 compute-1 ceph-mon[80754]: osdmap e52: 3 total, 3 up, 3 in
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:58 compute-1 ceph-mon[80754]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.gxdwyy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.gxdwyy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:20:58 compute-1 ceph-mon[80754]: Deploying daemon mds.cephfs.compute-2.gxdwyy on compute-2
Nov 29 06:20:58 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:20:58 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Nov 29 06:20:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e3 new map
Nov 29 06:21:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e3 print_map
                                           e3
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:19:35.589013+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-2.gxdwyy{-1:24145} state up:standby seq 1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Nov 29 06:21:00 compute-1 ceph-mon[80754]: pgmap v169: 181 pgs: 1 creating+peering, 180 active+clean; 450 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 2.9 KiB/s rd, 346 B/s wr, 4 op/s
Nov 29 06:21:00 compute-1 ceph-mon[80754]: 7.1a deep-scrub starts
Nov 29 06:21:00 compute-1 ceph-mon[80754]: 7.1a deep-scrub ok
Nov 29 06:21:00 compute-1 ceph-mon[80754]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Nov 29 06:21:00 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 06:21:00 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 06:21:00 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 06:21:00 compute-1 ceph-mon[80754]: osdmap e53: 3 total, 3 up, 3 in
Nov 29 06:21:00 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:21:00 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 06:21:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e4 new map
Nov 29 06:21:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e4 print_map
                                           e4
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:00.645745+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:creating seq 1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
Nov 29 06:21:00 compute-1 radosgw[83442]: LDAP not started since no server URIs were provided in the configuration.
Nov 29 06:21:00 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-1-cbugbv[83438]: 2025-11-29T06:21:00.840+0000 7f2e761db940 -1 LDAP not started since no server URIs were provided in the configuration.
Nov 29 06:21:00 compute-1 radosgw[83442]: framework: beast
Nov 29 06:21:00 compute-1 radosgw[83442]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Nov 29 06:21:00 compute-1 radosgw[83442]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Nov 29 06:21:00 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Nov 29 06:21:00 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Nov 29 06:21:00 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Nov 29 06:21:00 compute-1 radosgw[83442]: starting handler: beast
Nov 29 06:21:00 compute-1 radosgw[83442]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:21:00 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Nov 29 06:21:00 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Nov 29 06:21:00 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Nov 29 06:21:01 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Nov 29 06:21:01 compute-1 radosgw[83442]: mgrc service_daemon_register rgw.24113 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.cbugbv,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=916ce3c8-b215-47fd-909b-03c5b552b52f,zone_name=default,zonegroup_id=a7fe8251-a74c-4f06-a680-d530d14bb192,zonegroup_name=default}
Nov 29 06:21:01 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Nov 29 06:21:02 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Nov 29 06:21:02 compute-1 sshd-session[84058]: Invalid user janice from 118.194.230.250 port 50258
Nov 29 06:21:02 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:02 compute-1 ceph-mon[80754]: pgmap v171: 181 pgs: 1 creating+peering, 180 active+clean; 450 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 2.8 KiB/s rd, 341 B/s wr, 3 op/s
Nov 29 06:21:02 compute-1 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 06:21:02 compute-1 ceph-mon[80754]: osdmap e54: 3 total, 3 up, 3 in
Nov 29 06:21:02 compute-1 ceph-mon[80754]: mds.? [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] up:boot
Nov 29 06:21:02 compute-1 ceph-mon[80754]: daemon mds.cephfs.compute-2.gxdwyy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Nov 29 06:21:02 compute-1 ceph-mon[80754]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Nov 29 06:21:02 compute-1 ceph-mon[80754]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Nov 29 06:21:02 compute-1 ceph-mon[80754]: Cluster is now healthy
Nov 29 06:21:02 compute-1 ceph-mon[80754]: fsmap cephfs:0 1 up:standby
Nov 29 06:21:02 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.gxdwyy"}]: dispatch
Nov 29 06:21:02 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:02 compute-1 ceph-mon[80754]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:creating}
Nov 29 06:21:02 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:02 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jzycnf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 06:21:02 compute-1 ceph-mon[80754]: daemon mds.cephfs.compute-2.gxdwyy is now active in filesystem cephfs as rank 0
Nov 29 06:21:02 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jzycnf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 06:21:02 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:21:02 compute-1 ceph-mon[80754]: Deploying daemon mds.cephfs.compute-0.jzycnf on compute-0
Nov 29 06:21:02 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Nov 29 06:21:02 compute-1 sshd-session[84058]: Received disconnect from 118.194.230.250 port 50258:11: Bye Bye [preauth]
Nov 29 06:21:02 compute-1 sshd-session[84058]: Disconnected from invalid user janice 118.194.230.250 port 50258 [preauth]
Nov 29 06:21:03 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Nov 29 06:21:03 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Nov 29 06:21:03 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e5 new map
Nov 29 06:21:03 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e5 print_map
                                           e5
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:01.949294+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
Nov 29 06:21:04 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Nov 29 06:21:04 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Nov 29 06:21:04 compute-1 ceph-mon[80754]: pgmap v173: 181 pgs: 181 active+clean; 452 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 1.2 KiB/s rd, 3.8 KiB/s wr, 13 op/s
Nov 29 06:21:04 compute-1 ceph-mon[80754]: 7.1c scrub starts
Nov 29 06:21:04 compute-1 ceph-mon[80754]: 7.1c scrub ok
Nov 29 06:21:04 compute-1 ceph-mon[80754]: 4.18 scrub starts
Nov 29 06:21:04 compute-1 ceph-mon[80754]: 4.18 scrub ok
Nov 29 06:21:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e6 new map
Nov 29 06:21:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e6 print_map
                                           e6
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:01.949294+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:05 compute-1 ceph-mon[80754]: mds.? [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] up:active
Nov 29 06:21:05 compute-1 ceph-mon[80754]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active}
Nov 29 06:21:05 compute-1 ceph-mon[80754]: 6.4 scrub starts
Nov 29 06:21:05 compute-1 ceph-mon[80754]: 6.4 scrub ok
Nov 29 06:21:05 compute-1 ceph-mon[80754]: pgmap v174: 181 pgs: 181 active+clean; 452 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 1.0 KiB/s rd, 3.3 KiB/s wr, 11 op/s
Nov 29 06:21:05 compute-1 ceph-mon[80754]: 4.13 scrub starts
Nov 29 06:21:05 compute-1 ceph-mon[80754]: 4.13 scrub ok
Nov 29 06:21:05 compute-1 ceph-mon[80754]: 6.6 scrub starts
Nov 29 06:21:05 compute-1 ceph-mon[80754]: 6.6 scrub ok
Nov 29 06:21:06 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.c scrub starts
Nov 29 06:21:06 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.c scrub ok
Nov 29 06:21:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e7 new map
Nov 29 06:21:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e7 print_map
                                           e7
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:01.949294+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:06 compute-1 sudo[84060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:21:06 compute-1 sudo[84060]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:06 compute-1 sudo[84060]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:06 compute-1 sudo[84085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:21:06 compute-1 sudo[84085]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:06 compute-1 sudo[84085]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:06 compute-1 sudo[84110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:21:06 compute-1 sudo[84110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:06 compute-1 sudo[84110]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:06 compute-1 sudo[84135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:21:06 compute-1 sudo[84135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:21:07 compute-1 podman[84201]: 2025-11-29 06:21:07.004227 +0000 UTC m=+0.041254897 container create cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:21:07 compute-1 systemd[1]: Started libpod-conmon-cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9.scope.
Nov 29 06:21:07 compute-1 podman[84201]: 2025-11-29 06:21:06.985016985 +0000 UTC m=+0.022044902 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:21:07 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:21:07 compute-1 podman[84201]: 2025-11-29 06:21:07.100662522 +0000 UTC m=+0.137690439 container init cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Nov 29 06:21:07 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Nov 29 06:21:07 compute-1 podman[84201]: 2025-11-29 06:21:07.110086918 +0000 UTC m=+0.147114815 container start cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:21:07 compute-1 podman[84201]: 2025-11-29 06:21:07.115500326 +0000 UTC m=+0.152528243 container attach cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:21:07 compute-1 heuristic_ganguly[84216]: 167 167
Nov 29 06:21:07 compute-1 systemd[1]: libpod-cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9.scope: Deactivated successfully.
Nov 29 06:21:07 compute-1 podman[84201]: 2025-11-29 06:21:07.121767578 +0000 UTC m=+0.158795475 container died cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Nov 29 06:21:07 compute-1 ceph-mon[80754]: 5.12 deep-scrub starts
Nov 29 06:21:07 compute-1 ceph-mon[80754]: 5.12 deep-scrub ok
Nov 29 06:21:07 compute-1 ceph-mon[80754]: pgmap v175: 181 pgs: 181 active+clean; 456 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 122 KiB/s rd, 5.6 KiB/s wr, 219 op/s
Nov 29 06:21:07 compute-1 ceph-mon[80754]: 4.c scrub starts
Nov 29 06:21:07 compute-1 ceph-mon[80754]: 4.c scrub ok
Nov 29 06:21:07 compute-1 ceph-mon[80754]: mds.? [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] up:boot
Nov 29 06:21:07 compute-1 ceph-mon[80754]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 1 up:standby
Nov 29 06:21:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.jzycnf"}]: dispatch
Nov 29 06:21:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:21:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:07 compute-1 ceph-mon[80754]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 1 up:standby
Nov 29 06:21:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.vlqnad", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 06:21:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.vlqnad", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 06:21:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:21:07 compute-1 ceph-mon[80754]: Deploying daemon mds.cephfs.compute-1.vlqnad on compute-1
Nov 29 06:21:07 compute-1 ceph-mon[80754]: 6.9 deep-scrub starts
Nov 29 06:21:07 compute-1 ceph-mon[80754]: 6.9 deep-scrub ok
Nov 29 06:21:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-560408556d2562ba69fedfaee23ef6fa418e09e6722fa23dca860156a1f80608-merged.mount: Deactivated successfully.
Nov 29 06:21:07 compute-1 podman[84201]: 2025-11-29 06:21:07.168921274 +0000 UTC m=+0.205949171 container remove cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3)
Nov 29 06:21:07 compute-1 systemd[1]: libpod-conmon-cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9.scope: Deactivated successfully.
Nov 29 06:21:07 compute-1 systemd[1]: Reloading.
Nov 29 06:21:07 compute-1 systemd-rc-local-generator[84264]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:21:07 compute-1 systemd-sysv-generator[84267]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:21:08 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Nov 29 06:21:08 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.e scrub starts
Nov 29 06:21:08 compute-1 systemd[1]: Reloading.
Nov 29 06:21:08 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.e scrub ok
Nov 29 06:21:08 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:21:08 compute-1 ceph-mon[80754]: osdmap e55: 3 total, 3 up, 3 in
Nov 29 06:21:08 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:21:08 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:08 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:08 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:21:08 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:08 compute-1 ceph-mon[80754]: osdmap e56: 3 total, 3 up, 3 in
Nov 29 06:21:08 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:21:08 compute-1 systemd-rc-local-generator[84302]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:21:08 compute-1 systemd-sysv-generator[84307]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:21:08 compute-1 systemd[1]: Starting Ceph mds.cephfs.compute-1.vlqnad for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 06:21:08 compute-1 podman[84364]: 2025-11-29 06:21:08.716535702 +0000 UTC m=+0.040291981 container create 08cd4b182b0bede8363cef69388c2d99ea69ceb89d1599b75d6802dac432aaae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-1-vlqnad, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Nov 29 06:21:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7178d6186c4b6b3a979c787166746a9f5bcffadb4a7d2f848b3ba10715393dfa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:21:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7178d6186c4b6b3a979c787166746a9f5bcffadb4a7d2f848b3ba10715393dfa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:21:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7178d6186c4b6b3a979c787166746a9f5bcffadb4a7d2f848b3ba10715393dfa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:21:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7178d6186c4b6b3a979c787166746a9f5bcffadb4a7d2f848b3ba10715393dfa/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.vlqnad supports timestamps until 2038 (0x7fffffff)
Nov 29 06:21:08 compute-1 sshd-session[84312]: Invalid user cesar from 66.94.122.234 port 46686
Nov 29 06:21:08 compute-1 podman[84364]: 2025-11-29 06:21:08.78975001 +0000 UTC m=+0.113506319 container init 08cd4b182b0bede8363cef69388c2d99ea69ceb89d1599b75d6802dac432aaae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-1-vlqnad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:21:08 compute-1 podman[84364]: 2025-11-29 06:21:08.698717575 +0000 UTC m=+0.022473894 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:21:08 compute-1 podman[84364]: 2025-11-29 06:21:08.79598793 +0000 UTC m=+0.119744219 container start 08cd4b182b0bede8363cef69388c2d99ea69ceb89d1599b75d6802dac432aaae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-1-vlqnad, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Nov 29 06:21:08 compute-1 bash[84364]: 08cd4b182b0bede8363cef69388c2d99ea69ceb89d1599b75d6802dac432aaae
Nov 29 06:21:08 compute-1 systemd[1]: Started Ceph mds.cephfs.compute-1.vlqnad for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 06:21:08 compute-1 sshd-session[84312]: Received disconnect from 66.94.122.234 port 46686:11: Bye Bye [preauth]
Nov 29 06:21:08 compute-1 sshd-session[84312]: Disconnected from invalid user cesar 66.94.122.234 port 46686 [preauth]
Nov 29 06:21:08 compute-1 sudo[84135]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:08 compute-1 ceph-mds[84384]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 06:21:08 compute-1 ceph-mds[84384]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Nov 29 06:21:08 compute-1 ceph-mds[84384]: main not setting numa affinity
Nov 29 06:21:08 compute-1 ceph-mds[84384]: pidfile_write: ignore empty --pid-file
Nov 29 06:21:08 compute-1 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-1-vlqnad[84380]: starting mds.cephfs.compute-1.vlqnad at 
Nov 29 06:21:09 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:10 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Updating MDS map to version 7 from mon.2
Nov 29 06:21:11 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Nov 29 06:21:11 compute-1 ceph-mon[80754]: pgmap v177: 181 pgs: 181 active+clean; 456 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 122 KiB/s rd, 5.6 KiB/s wr, 219 op/s
Nov 29 06:21:11 compute-1 ceph-mon[80754]: 6.e scrub starts
Nov 29 06:21:11 compute-1 ceph-mon[80754]: 6.e scrub ok
Nov 29 06:21:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e8 new map
Nov 29 06:21:12 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Updating MDS map to version 8 from mon.2
Nov 29 06:21:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e8 print_map
                                           e8
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:01.949294+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
                                           [mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:12 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Monitors have assigned me to become a standby.
Nov 29 06:21:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Nov 29 06:21:12 compute-1 ceph-mon[80754]: 2.13 scrub starts
Nov 29 06:21:12 compute-1 ceph-mon[80754]: 2.13 scrub ok
Nov 29 06:21:12 compute-1 ceph-mon[80754]: 6.b scrub starts
Nov 29 06:21:12 compute-1 ceph-mon[80754]: 6.b scrub ok
Nov 29 06:21:12 compute-1 ceph-mon[80754]: pgmap v179: 212 pgs: 31 unknown, 181 active+clean; 456 KiB data, 81 MiB used, 21 GiB / 21 GiB avail; 121 KiB/s rd, 2.7 KiB/s wr, 209 op/s
Nov 29 06:21:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:21:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:12 compute-1 ceph-mon[80754]: osdmap e57: 3 total, 3 up, 3 in
Nov 29 06:21:12 compute-1 ceph-mon[80754]: 6.c scrub starts
Nov 29 06:21:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 06:21:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:13 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 58 pg[10.0( v 54'96 (0'0,54'96] local-lis/les=49/50 n=8 ec=49/49 lis/c=49/49 les/c/f=50/50/0 sis=58 pruub=11.911335945s) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 54'95 mlcod 54'95 active pruub 169.060745239s@ mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:13 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 58 pg[10.0( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=49/49 lis/c=49/49 les/c/f=50/50/0 sis=58 pruub=11.911335945s) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 54'95 mlcod 0'0 unknown pruub 169.060745239s@ mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e58 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Nov 29 06:21:14 compute-1 ceph-mon[80754]: pgmap v181: 212 pgs: 1 peering, 31 unknown, 180 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 198 op/s
Nov 29 06:21:14 compute-1 ceph-mon[80754]: 6.c scrub ok
Nov 29 06:21:14 compute-1 ceph-mon[80754]: 6.f scrub starts
Nov 29 06:21:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Nov 29 06:21:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:14 compute-1 ceph-mon[80754]: 6.f scrub ok
Nov 29 06:21:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:14 compute-1 ceph-mon[80754]: mds.? [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] up:boot
Nov 29 06:21:14 compute-1 ceph-mon[80754]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:21:14 compute-1 ceph-mon[80754]: osdmap e58: 3 total, 3 up, 3 in
Nov 29 06:21:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.vlqnad"}]: dispatch
Nov 29 06:21:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1b( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1f( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.10( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.12( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.7( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1c( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1a( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.19( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.b( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1e( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.a( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.8( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.f( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.d( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.14( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.3( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.15( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.c( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.e( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.9( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.4( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.5( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1( v 54'96 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.6( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.2( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1d( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.17( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.16( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.11( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.18( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.13( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1b( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.7( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.10( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.19( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.b( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.12( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1f( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1c( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1e( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.a( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.8( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.14( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.f( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.15( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.3( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.c( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.d( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.0( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=49/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 54'95 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.e( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.9( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.4( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.5( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1d( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.2( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.17( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.11( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.16( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1a( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.6( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.18( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.13( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:14 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:16 compute-1 ceph-mon[80754]: 3.11 scrub starts
Nov 29 06:21:16 compute-1 ceph-mon[80754]: 3.11 scrub ok
Nov 29 06:21:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:16 compute-1 ceph-mon[80754]: pgmap v183: 274 pgs: 1 peering, 93 unknown, 180 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 198 op/s
Nov 29 06:21:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:16 compute-1 ceph-mon[80754]: osdmap e59: 3 total, 3 up, 3 in
Nov 29 06:21:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:16 compute-1 ceph-mon[80754]: Deploying daemon haproxy.rgw.default.compute-0.zzbnoj on compute-0
Nov 29 06:21:16 compute-1 ceph-mon[80754]: 3.8 scrub starts
Nov 29 06:21:16 compute-1 ceph-mon[80754]: 3.8 scrub ok
Nov 29 06:21:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Nov 29 06:21:17 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Nov 29 06:21:17 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Nov 29 06:21:17 compute-1 ceph-mon[80754]: 7.6 scrub starts
Nov 29 06:21:17 compute-1 ceph-mon[80754]: 7.6 scrub ok
Nov 29 06:21:17 compute-1 ceph-mon[80754]: pgmap v185: 274 pgs: 31 unknown, 243 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 196 op/s
Nov 29 06:21:17 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:17 compute-1 ceph-mon[80754]: 3.0 scrub starts
Nov 29 06:21:17 compute-1 ceph-mon[80754]: 3.0 scrub ok
Nov 29 06:21:17 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:17 compute-1 ceph-mon[80754]: osdmap e60: 3 total, 3 up, 3 in
Nov 29 06:21:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e9 new map
Nov 29 06:21:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e9 print_map
                                           e9
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        9
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:17.214295+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 6 join_fscid=1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 4 join_fscid=1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
                                           [mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:18 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Nov 29 06:21:18 compute-1 ceph-mon[80754]: 6.8 scrub starts
Nov 29 06:21:18 compute-1 ceph-mon[80754]: 6.8 scrub ok
Nov 29 06:21:18 compute-1 ceph-mon[80754]: mds.? [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] up:standby
Nov 29 06:21:18 compute-1 ceph-mon[80754]: mds.? [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] up:active
Nov 29 06:21:18 compute-1 ceph-mon[80754]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:21:18 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 06:21:18 compute-1 ceph-mon[80754]: osdmap e61: 3 total, 3 up, 3 in
Nov 29 06:21:19 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:20 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.d scrub starts
Nov 29 06:21:20 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.d scrub ok
Nov 29 06:21:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e10 new map
Nov 29 06:21:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).mds e10 print_map
                                           e10
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        9
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-11-29T06:19:35.588785+0000
                                           modified        2025-11-29T06:21:17.214295+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
                                           max_mds        1
                                           in        0
                                           up        {0=24145}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           [mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 6 join_fscid=1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 4 join_fscid=1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
                                           [mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 3 join_fscid=1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 06:21:21 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Updating MDS map to version 10 from mon.2
Nov 29 06:21:21 compute-1 ceph-mon[80754]: pgmap v187: 305 pgs: 31 unknown, 274 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:21 compute-1 ceph-mon[80754]: 7.13 scrub starts
Nov 29 06:21:21 compute-1 ceph-mon[80754]: 7.13 scrub ok
Nov 29 06:21:21 compute-1 ceph-mon[80754]: 7.3 scrub starts
Nov 29 06:21:21 compute-1 ceph-mon[80754]: 7.3 scrub ok
Nov 29 06:21:22 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.a scrub starts
Nov 29 06:21:22 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.a scrub ok
Nov 29 06:21:23 compute-1 ceph-mon[80754]: pgmap v189: 305 pgs: 31 unknown, 274 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:23 compute-1 ceph-mon[80754]: 6.d scrub starts
Nov 29 06:21:23 compute-1 ceph-mon[80754]: 6.d scrub ok
Nov 29 06:21:23 compute-1 ceph-mon[80754]: 5.0 deep-scrub starts
Nov 29 06:21:23 compute-1 ceph-mon[80754]: 5.0 deep-scrub ok
Nov 29 06:21:23 compute-1 ceph-mon[80754]: 7.18 scrub starts
Nov 29 06:21:23 compute-1 ceph-mon[80754]: 7.18 scrub ok
Nov 29 06:21:23 compute-1 ceph-mon[80754]: mds.? [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] up:standby
Nov 29 06:21:23 compute-1 ceph-mon[80754]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:21:23 compute-1 ceph-mon[80754]: pgmap v190: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Nov 29 06:21:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:23 compute-1 ceph-mon[80754]: 6.a scrub starts
Nov 29 06:21:23 compute-1 ceph-mon[80754]: 6.a scrub ok
Nov 29 06:21:24 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1b( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.569214821s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.522583008s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.12( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.575281143s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528656006s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.12( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.575191498s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528656006s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1b( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.569099426s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.522583008s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.10( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574449539s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528198242s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1e( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574452400s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528381348s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.10( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574231148s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528198242s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1e( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574331284s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528381348s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.19( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574158669s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528259277s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.19( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574132919s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528259277s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.8( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574046135s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528350830s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.8( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574021339s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528350830s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.f( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573841095s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528411865s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.15( v 59'99 (0'0,59'99] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573786736s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 59'98 active pruub 182.528396606s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.14( v 59'99 (0'0,59'99] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573767662s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 59'98 active pruub 182.528396606s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.f( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573729515s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528411865s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.3( v 59'99 (0'0,59'99] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573698044s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 59'98 active pruub 182.528411865s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.14( v 59'99 (0'0,59'99] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573683739s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 0'0 unknown NOTIFY pruub 182.528396606s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.3( v 59'99 (0'0,59'99] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573577881s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 0'0 unknown NOTIFY pruub 182.528411865s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.15( v 59'99 (0'0,59'99] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573624611s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 0'0 unknown NOTIFY pruub 182.528396606s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.576788902s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.532211304s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.4( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572895050s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528457642s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.576555252s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.532211304s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.2( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572846413s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528533936s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.4( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572806358s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528457642s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.18( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.576239586s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.532165527s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.11( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572647095s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528610229s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.2( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572795868s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528533936s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.11( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572585106s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528610229s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.18( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.576163292s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.532165527s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.5( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572252274s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528533936s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.13( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.575842857s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.532211304s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.13( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.575673103s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.532211304s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.5( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572030067s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528533936s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.17( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.14( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.10( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.12( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1e( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.1b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.7( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.4( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1d( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.4( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.f( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.5( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.8( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.14( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1c( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.19( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1a( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.12( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1b( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.18( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:21:24 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.003000079s ======
Nov 29 06:21:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:25.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000079s
Nov 29 06:21:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Nov 29 06:21:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:21:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Nov 29 06:21:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:25 compute-1 ceph-mon[80754]: osdmap e62: 3 total, 3 up, 3 in
Nov 29 06:21:25 compute-1 sshd-session[84404]: Accepted publickey for zuul from 192.168.122.30 port 44948 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:21:25 compute-1 systemd-logind[785]: New session 33 of user zuul.
Nov 29 06:21:25 compute-1 systemd[1]: Started Session 33 of User zuul.
Nov 29 06:21:25 compute-1 sshd-session[84404]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:21:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Nov 29 06:21:25 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.14( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:25 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.12( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.7( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.f( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1c( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.5( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.4( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1( v 54'2 (0'0,54'2] local-lis/les=62/63 n=1 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.14( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.1b( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1b( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.18( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.8( v 46'4 lc 0'0 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.12( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1e( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.10( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.4( v 46'4 (0'0,46'4] local-lis/les=62/63 n=1 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.19( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1d( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1a( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.17( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:21:26 compute-1 python3.9[84557]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:21:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:27.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:28 compute-1 sudo[84769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqhaaezetehuevrjaeiepzpmqyujajun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397288.1715126-62-31301665855546/AnsiballZ_command.py'
Nov 29 06:21:28 compute-1 sudo[84769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:28 compute-1 python3.9[84771]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:21:29 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Nov 29 06:21:29 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Nov 29 06:21:29 compute-1 ceph-mon[80754]: 7.4 scrub starts
Nov 29 06:21:29 compute-1 ceph-mon[80754]: 7.4 scrub ok
Nov 29 06:21:29 compute-1 ceph-mon[80754]: pgmap v191: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:29 compute-1 ceph-mon[80754]: 5.d deep-scrub starts
Nov 29 06:21:29 compute-1 ceph-mon[80754]: 5.d deep-scrub ok
Nov 29 06:21:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Nov 29 06:21:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:21:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:29 compute-1 ceph-mon[80754]: osdmap e63: 3 total, 3 up, 3 in
Nov 29 06:21:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:29 compute-1 ceph-mon[80754]: 5.b scrub starts
Nov 29 06:21:29 compute-1 ceph-mon[80754]: 5.b scrub ok
Nov 29 06:21:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Nov 29 06:21:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:29.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:29 compute-1 ceph-mon[80754]: pgmap v194: 305 pgs: 9 peering, 296 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:29 compute-1 ceph-mon[80754]: Deploying daemon haproxy.rgw.default.compute-2.lpqgfx on compute-2
Nov 29 06:21:29 compute-1 ceph-mon[80754]: pgmap v195: 305 pgs: 9 peering, 296 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 109 B/s, 0 objects/s recovering
Nov 29 06:21:29 compute-1 ceph-mon[80754]: 7.e scrub starts
Nov 29 06:21:29 compute-1 ceph-mon[80754]: 7.e scrub ok
Nov 29 06:21:29 compute-1 ceph-mon[80754]: 6.2 scrub starts
Nov 29 06:21:29 compute-1 ceph-mon[80754]: 6.2 scrub ok
Nov 29 06:21:29 compute-1 ceph-mon[80754]: osdmap e64: 3 total, 3 up, 3 in
Nov 29 06:21:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:31.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:31 compute-1 ceph-mon[80754]: 7.9 scrub starts
Nov 29 06:21:31 compute-1 ceph-mon[80754]: 7.9 scrub ok
Nov 29 06:21:31 compute-1 ceph-mon[80754]: pgmap v197: 305 pgs: 40 peering, 265 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 145 B/s, 0 objects/s recovering
Nov 29 06:21:32 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Nov 29 06:21:32 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Nov 29 06:21:32 compute-1 ceph-mon[80754]: 7.10 scrub starts
Nov 29 06:21:32 compute-1 ceph-mon[80754]: 7.10 scrub ok
Nov 29 06:21:32 compute-1 ceph-mon[80754]: pgmap v198: 305 pgs: 31 peering, 274 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 111 B/s, 0 objects/s recovering
Nov 29 06:21:32 compute-1 ceph-mon[80754]: 4.1b scrub starts
Nov 29 06:21:32 compute-1 ceph-mon[80754]: 4.1b scrub ok
Nov 29 06:21:32 compute-1 ceph-mon[80754]: 5.8 scrub starts
Nov 29 06:21:32 compute-1 ceph-mon[80754]: 5.8 scrub ok
Nov 29 06:21:33 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Nov 29 06:21:33 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Nov 29 06:21:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:21:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:33.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:21:34 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:34 compute-1 ceph-mon[80754]: 7.b scrub starts
Nov 29 06:21:34 compute-1 ceph-mon[80754]: 7.b scrub ok
Nov 29 06:21:34 compute-1 ceph-mon[80754]: 4.1a scrub starts
Nov 29 06:21:34 compute-1 ceph-mon[80754]: 4.1a scrub ok
Nov 29 06:21:35 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Nov 29 06:21:35 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Nov 29 06:21:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Nov 29 06:21:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:35.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:36 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.e scrub starts
Nov 29 06:21:36 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.e scrub ok
Nov 29 06:21:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:36.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:37 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.3 deep-scrub starts
Nov 29 06:21:37 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.3 deep-scrub ok
Nov 29 06:21:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:37.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:37 compute-1 sudo[84769]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:38.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:39.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:39 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:39 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Nov 29 06:21:39 compute-1 ceph-mon[80754]: pgmap v199: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 118 B/s, 0 objects/s recovering
Nov 29 06:21:40 compute-1 ceph-mon[80754]: 3.15 scrub starts
Nov 29 06:21:40 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Nov 29 06:21:40 compute-1 ceph-mon[80754]: 3.15 scrub ok
Nov 29 06:21:40 compute-1 ceph-mon[80754]: 6.5 scrub starts
Nov 29 06:21:40 compute-1 ceph-mon[80754]: 6.5 scrub ok
Nov 29 06:21:40 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Nov 29 06:21:40 compute-1 ceph-mon[80754]: osdmap e65: 3 total, 3 up, 3 in
Nov 29 06:21:40 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Nov 29 06:21:40 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Nov 29 06:21:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:40.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:41.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:41 compute-1 sshd-session[84407]: Connection closed by 192.168.122.30 port 44948
Nov 29 06:21:41 compute-1 sshd-session[84404]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:21:41 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Nov 29 06:21:41 compute-1 systemd[1]: session-33.scope: Consumed 9.823s CPU time.
Nov 29 06:21:41 compute-1 systemd-logind[785]: Session 33 logged out. Waiting for processes to exit.
Nov 29 06:21:41 compute-1 systemd-logind[785]: Removed session 33.
Nov 29 06:21:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:42.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:43.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:44 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.d scrub starts
Nov 29 06:21:44 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.d scrub ok
Nov 29 06:21:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:44.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:44 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Nov 29 06:21:44 compute-1 ceph-mon[80754]: pgmap v201: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 13 B/s, 0 objects/s recovering
Nov 29 06:21:44 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 06:21:44 compute-1 ceph-mon[80754]: 4.e scrub starts
Nov 29 06:21:44 compute-1 ceph-mon[80754]: 4.e scrub ok
Nov 29 06:21:44 compute-1 ceph-mon[80754]: 6.3 deep-scrub starts
Nov 29 06:21:44 compute-1 ceph-mon[80754]: 6.3 deep-scrub ok
Nov 29 06:21:44 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:44 compute-1 ceph-mon[80754]: pgmap v202: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 11 B/s, 0 objects/s recovering
Nov 29 06:21:44 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 06:21:44 compute-1 ceph-mon[80754]: 7.f scrub starts
Nov 29 06:21:44 compute-1 ceph-mon[80754]: 7.f scrub ok
Nov 29 06:21:44 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 06:21:44 compute-1 ceph-mon[80754]: pgmap v203: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:21:44 compute-1 ceph-mon[80754]: 6.7 scrub starts
Nov 29 06:21:44 compute-1 ceph-mon[80754]: 6.7 scrub ok
Nov 29 06:21:44 compute-1 ceph-mon[80754]: osdmap e66: 3 total, 3 up, 3 in
Nov 29 06:21:44 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 06:21:44 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:45 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Nov 29 06:21:45 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Nov 29 06:21:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:45.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:46.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:47 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Nov 29 06:21:47 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Nov 29 06:21:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:47.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:48.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:48 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Nov 29 06:21:48 compute-1 ceph-mon[80754]: 5.13 scrub starts
Nov 29 06:21:48 compute-1 ceph-mon[80754]: pgmap v205: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Nov 29 06:21:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:48 compute-1 ceph-mon[80754]: 7.8 scrub starts
Nov 29 06:21:48 compute-1 ceph-mon[80754]: 7.8 scrub ok
Nov 29 06:21:48 compute-1 ceph-mon[80754]: pgmap v206: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:48 compute-1 ceph-mon[80754]: 4.d scrub starts
Nov 29 06:21:48 compute-1 ceph-mon[80754]: 4.d scrub ok
Nov 29 06:21:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 06:21:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 06:21:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:48 compute-1 ceph-mon[80754]: osdmap e67: 3 total, 3 up, 3 in
Nov 29 06:21:48 compute-1 ceph-mon[80754]: 3.16 scrub starts
Nov 29 06:21:48 compute-1 ceph-mon[80754]: 3.16 scrub ok
Nov 29 06:21:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:49.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:50.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Nov 29 06:21:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Nov 29 06:21:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:51.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Nov 29 06:21:51 compute-1 ceph-mon[80754]: 5.13 scrub ok
Nov 29 06:21:51 compute-1 ceph-mon[80754]: 7.1e scrub starts
Nov 29 06:21:51 compute-1 ceph-mon[80754]: pgmap v208: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:51 compute-1 ceph-mon[80754]: 7.1e scrub ok
Nov 29 06:21:51 compute-1 ceph-mon[80754]: 7.1b scrub starts
Nov 29 06:21:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:51 compute-1 ceph-mon[80754]: 5.10 scrub starts
Nov 29 06:21:51 compute-1 ceph-mon[80754]: 5.10 scrub ok
Nov 29 06:21:51 compute-1 ceph-mon[80754]: 7.1b scrub ok
Nov 29 06:21:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Nov 29 06:21:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:52.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:53 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Nov 29 06:21:53 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Nov 29 06:21:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:53.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:21:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:54.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:21:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 3.e scrub starts
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 3.e scrub ok
Nov 29 06:21:54 compute-1 ceph-mon[80754]: pgmap v209: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 5.1a scrub starts
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 5.1a scrub ok
Nov 29 06:21:54 compute-1 ceph-mon[80754]: osdmap e68: 3 total, 3 up, 3 in
Nov 29 06:21:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 29 06:21:54 compute-1 ceph-mon[80754]: Deploying daemon keepalived.rgw.default.compute-2.klqjoa on compute-2
Nov 29 06:21:54 compute-1 ceph-mon[80754]: pgmap v211: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 3.1d scrub starts
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 3.1d scrub ok
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 7.2 scrub starts
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 7.2 scrub ok
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 5.11 scrub starts
Nov 29 06:21:54 compute-1 ceph-mon[80754]: 5.11 scrub ok
Nov 29 06:21:54 compute-1 ceph-mon[80754]: pgmap v212: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:54 compute-1 ceph-mon[80754]: osdmap e69: 3 total, 3 up, 3 in
Nov 29 06:21:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Nov 29 06:21:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:55.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:55 compute-1 ceph-mon[80754]: 3.14 scrub starts
Nov 29 06:21:55 compute-1 ceph-mon[80754]: 3.14 scrub ok
Nov 29 06:21:55 compute-1 ceph-mon[80754]: pgmap v214: 305 pgs: 8 unknown, 297 active+clean; 455 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:21:55 compute-1 ceph-mon[80754]: osdmap e70: 3 total, 3 up, 3 in
Nov 29 06:21:56 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Nov 29 06:21:56 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Nov 29 06:21:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:21:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:56.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:21:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Nov 29 06:21:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:21:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:57.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:21:57 compute-1 sshd-session[84828]: Accepted publickey for zuul from 192.168.122.30 port 49582 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:21:57 compute-1 systemd-logind[785]: New session 34 of user zuul.
Nov 29 06:21:57 compute-1 systemd[1]: Started Session 34 of User zuul.
Nov 29 06:21:57 compute-1 sshd-session[84828]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:21:58 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Nov 29 06:21:58 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Nov 29 06:21:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:58.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:58 compute-1 ceph-mon[80754]: 5.1c scrub starts
Nov 29 06:21:58 compute-1 ceph-mon[80754]: 5.1c scrub ok
Nov 29 06:21:58 compute-1 ceph-mon[80754]: pgmap v216: 305 pgs: 6 active+remapped, 1 active+recovering+remapped, 1 unknown, 297 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 49 KiB/s rd, 984 B/s wr, 87 op/s; 6/210 objects misplaced (2.857%); 120 B/s, 4 objects/s recovering
Nov 29 06:21:58 compute-1 ceph-mon[80754]: osdmap e71: 3 total, 3 up, 3 in
Nov 29 06:21:58 compute-1 python3.9[84981]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 29 06:21:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Nov 29 06:21:59 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Nov 29 06:21:59 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Nov 29 06:21:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:21:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:21:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:59.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:21:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:21:59 compute-1 python3.9[85155]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:21:59 compute-1 ceph-mon[80754]: 3.1c scrub starts
Nov 29 06:21:59 compute-1 ceph-mon[80754]: pgmap v218: 305 pgs: 6 active+remapped, 1 active+recovering+remapped, 1 unknown, 297 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 59 KiB/s rd, 1.2 KiB/s wr, 106 op/s; 6/210 objects misplaced (2.857%); 146 B/s, 4 objects/s recovering
Nov 29 06:21:59 compute-1 ceph-mon[80754]: 3.1c scrub ok
Nov 29 06:21:59 compute-1 ceph-mon[80754]: 3.1b scrub starts
Nov 29 06:21:59 compute-1 ceph-mon[80754]: 3.1b scrub ok
Nov 29 06:21:59 compute-1 ceph-mon[80754]: osdmap e72: 3 total, 3 up, 3 in
Nov 29 06:21:59 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Nov 29 06:21:59 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Nov 29 06:22:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:00.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:00 compute-1 sudo[85309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rexjggfjdcafwkdqzlrkgjrruocottkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397320.5010133-99-185839542644261/AnsiballZ_command.py'
Nov 29 06:22:00 compute-1 sudo[85309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:00 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Nov 29 06:22:00 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Nov 29 06:22:01 compute-1 python3.9[85311]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:22:01 compute-1 sudo[85309]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:01.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:01 compute-1 ceph-mon[80754]: 3.12 scrub starts
Nov 29 06:22:01 compute-1 ceph-mon[80754]: 3.12 scrub ok
Nov 29 06:22:01 compute-1 ceph-mon[80754]: 5.7 scrub starts
Nov 29 06:22:01 compute-1 ceph-mon[80754]: 5.7 scrub ok
Nov 29 06:22:01 compute-1 ceph-mon[80754]: 3.13 scrub starts
Nov 29 06:22:01 compute-1 ceph-mon[80754]: 3.13 scrub ok
Nov 29 06:22:01 compute-1 ceph-mon[80754]: pgmap v220: 305 pgs: 7 peering, 298 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 67 KiB/s rd, 1.3 KiB/s wr, 121 op/s; 0 B/s, 0 objects/s recovering
Nov 29 06:22:01 compute-1 ceph-mon[80754]: 4.19 scrub starts
Nov 29 06:22:01 compute-1 ceph-mon[80754]: 4.19 scrub ok
Nov 29 06:22:02 compute-1 sudo[85462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxfwveiipoxeqovufimdzydxnvjxcfax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397321.63629-135-170237918720240/AnsiballZ_stat.py'
Nov 29 06:22:02 compute-1 sudo[85462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:02.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:02 compute-1 python3.9[85464]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:22:02 compute-1 sudo[85462]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:03 compute-1 ceph-mon[80754]: 3.17 scrub starts
Nov 29 06:22:03 compute-1 ceph-mon[80754]: 3.17 scrub ok
Nov 29 06:22:03 compute-1 ceph-mon[80754]: 5.1b scrub starts
Nov 29 06:22:03 compute-1 ceph-mon[80754]: 5.1b scrub ok
Nov 29 06:22:03 compute-1 ceph-mon[80754]: 3.18 scrub starts
Nov 29 06:22:03 compute-1 ceph-mon[80754]: 3.18 scrub ok
Nov 29 06:22:03 compute-1 ceph-mon[80754]: pgmap v221: 305 pgs: 7 peering, 298 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 60 KiB/s rd, 1.2 KiB/s wr, 108 op/s; 0 B/s, 0 objects/s recovering
Nov 29 06:22:03 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.f scrub starts
Nov 29 06:22:03 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.f scrub ok
Nov 29 06:22:03 compute-1 sudo[85616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buohpmajthesigpvqmkvleodyncnyfve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397322.7205038-168-231459358554480/AnsiballZ_file.py'
Nov 29 06:22:03 compute-1 sudo[85616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:03 compute-1 python3.9[85618]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:22:03 compute-1 sudo[85616]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:03.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:04.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:04 compute-1 sudo[85768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwwzhybkkwsdfxlwkprxqhtnsioqkxmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397324.0210662-195-21613706663744/AnsiballZ_file.py'
Nov 29 06:22:04 compute-1 sudo[85768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:04 compute-1 python3.9[85770]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:22:04 compute-1 sudo[85768]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:04 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Nov 29 06:22:05 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Nov 29 06:22:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:05.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:05 compute-1 ceph-mon[80754]: 3.1 scrub starts
Nov 29 06:22:05 compute-1 ceph-mon[80754]: 3.1 scrub ok
Nov 29 06:22:05 compute-1 ceph-mon[80754]: 5.f scrub starts
Nov 29 06:22:05 compute-1 ceph-mon[80754]: 5.f scrub ok
Nov 29 06:22:05 compute-1 ceph-mon[80754]: 7.14 scrub starts
Nov 29 06:22:05 compute-1 ceph-mon[80754]: 7.14 scrub ok
Nov 29 06:22:06 compute-1 python3.9[85920]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:22:06 compute-1 network[85937]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:22:06 compute-1 network[85938]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:22:06 compute-1 network[85939]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:22:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:06.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Nov 29 06:22:06 compute-1 ceph-mon[80754]: 5.14 scrub starts
Nov 29 06:22:06 compute-1 ceph-mon[80754]: 5.14 scrub ok
Nov 29 06:22:06 compute-1 ceph-mon[80754]: pgmap v222: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 6.2 KiB/s rd, 127 B/s wr, 11 op/s; 41 B/s, 1 objects/s recovering
Nov 29 06:22:06 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Nov 29 06:22:06 compute-1 ceph-mon[80754]: 5.1f scrub starts
Nov 29 06:22:06 compute-1 ceph-mon[80754]: 5.1f scrub ok
Nov 29 06:22:06 compute-1 ceph-mon[80754]: 7.1d deep-scrub starts
Nov 29 06:22:06 compute-1 ceph-mon[80754]: 7.1d deep-scrub ok
Nov 29 06:22:06 compute-1 ceph-mon[80754]: pgmap v223: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 5.4 KiB/s rd, 110 B/s wr, 9 op/s; 35 B/s, 0 objects/s recovering
Nov 29 06:22:06 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Nov 29 06:22:06 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Nov 29 06:22:06 compute-1 ceph-mon[80754]: osdmap e73: 3 total, 3 up, 3 in
Nov 29 06:22:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:07.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:07 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Nov 29 06:22:08 compute-1 ceph-mon[80754]: 5.17 scrub starts
Nov 29 06:22:08 compute-1 ceph-mon[80754]: 5.17 scrub ok
Nov 29 06:22:08 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:08 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:08 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Nov 29 06:22:08 compute-1 ceph-mon[80754]: osdmap e74: 3 total, 3 up, 3 in
Nov 29 06:22:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:08.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:09 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Nov 29 06:22:09 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 75 pg[9.1e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=75) [0] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:09 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 75 pg[9.6( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=75) [0] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:09 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 75 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=75) [0] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:09 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 75 pg[9.e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=75) [0] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:09 compute-1 ceph-mon[80754]: 5.1e scrub starts
Nov 29 06:22:09 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:09 compute-1 ceph-mon[80754]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 29 06:22:09 compute-1 ceph-mon[80754]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 29 06:22:09 compute-1 ceph-mon[80754]: Deploying daemon keepalived.rgw.default.compute-0.uyqrbs on compute-0
Nov 29 06:22:09 compute-1 ceph-mon[80754]: 5.1e scrub ok
Nov 29 06:22:09 compute-1 ceph-mon[80754]: pgmap v226: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 151 B/s, 4 objects/s recovering
Nov 29 06:22:09 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Nov 29 06:22:09 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Nov 29 06:22:09 compute-1 ceph-mon[80754]: osdmap e75: 3 total, 3 up, 3 in
Nov 29 06:22:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:09.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:09 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:10.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:10 compute-1 python3.9[86199]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:22:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Nov 29 06:22:10 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:10 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:10 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:10 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.6( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:10 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:10 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.6( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:10 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.1e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:10 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.1e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:11.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:11 compute-1 python3.9[86349]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:22:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:11 compute-1 ceph-mon[80754]: pgmap v228: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:22:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Nov 29 06:22:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:12.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Nov 29 06:22:13 compute-1 python3.9[86503]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:22:13 compute-1 ceph-mon[80754]: osdmap e76: 3 total, 3 up, 3 in
Nov 29 06:22:13 compute-1 ceph-mon[80754]: 5.1d scrub starts
Nov 29 06:22:13 compute-1 ceph-mon[80754]: 5.1d scrub ok
Nov 29 06:22:13 compute-1 ceph-mon[80754]: pgmap v230: 305 pgs: 1 active+recovering+remapped, 1 active+remapped, 2 active+recovery_wait+remapped, 301 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 15/215 objects misplaced (6.977%); 38 B/s, 1 objects/s recovering
Nov 29 06:22:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Nov 29 06:22:13 compute-1 ceph-mon[80754]: osdmap e77: 3 total, 3 up, 3 in
Nov 29 06:22:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:13.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:13 compute-1 sshd-session[86537]: Received disconnect from 45.55.249.98 port 58492:11: Bye Bye [preauth]
Nov 29 06:22:13 compute-1 sshd-session[86537]: Disconnected from authenticating user root 45.55.249.98 port 58492 [preauth]
Nov 29 06:22:14 compute-1 sudo[86661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsdwcbsybrvdcssuuwwpedfprdoveevd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397333.6354516-339-108113619679504/AnsiballZ_setup.py'
Nov 29 06:22:14 compute-1 sudo[86661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:14 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Nov 29 06:22:14 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Nov 29 06:22:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:14.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:14 compute-1 python3.9[86663]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:22:14 compute-1 sudo[86661]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:14 compute-1 sudo[86745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydltrtmekxkajiduodvezhueofgpymsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397333.6354516-339-108113619679504/AnsiballZ_dnf.py'
Nov 29 06:22:14 compute-1 sudo[86745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:15 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Nov 29 06:22:15 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Nov 29 06:22:15 compute-1 python3.9[86747]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:22:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:15.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:16 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Nov 29 06:22:16 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Nov 29 06:22:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:16.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Nov 29 06:22:16 compute-1 ceph-mon[80754]: 5.15 scrub starts
Nov 29 06:22:16 compute-1 ceph-mon[80754]: 5.15 scrub ok
Nov 29 06:22:16 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.e( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:16 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:16 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:16 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.e( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:16 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:16 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:16 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.6( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:16 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.6( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:17 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Nov 29 06:22:17 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Nov 29 06:22:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:17.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:18 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 3.19 scrub starts
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 3.19 scrub ok
Nov 29 06:22:18 compute-1 ceph-mon[80754]: pgmap v232: 305 pgs: 1 active+recovering+remapped, 1 active+remapped, 2 active+recovery_wait+remapped, 301 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 15/215 objects misplaced (6.977%); 36 B/s, 1 objects/s recovering
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 7.5 scrub starts
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 7.5 scrub ok
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 5.18 scrub starts
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 5.18 scrub ok
Nov 29 06:22:18 compute-1 ceph-mon[80754]: pgmap v233: 305 pgs: 1 active+recovering+remapped, 5 active+remapped, 2 active+recovery_wait+remapped, 297 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 15/215 objects misplaced (6.977%); 63 B/s, 4 objects/s recovering
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 5.1 scrub starts
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 5.1 scrub ok
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 7.a scrub starts
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 7.a scrub ok
Nov 29 06:22:18 compute-1 ceph-mon[80754]: osdmap e78: 3 total, 3 up, 3 in
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 3.5 scrub starts
Nov 29 06:22:18 compute-1 ceph-mon[80754]: 3.5 scrub ok
Nov 29 06:22:18 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:18.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:18 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 79 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:18 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 79 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:18 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 79 pg[9.6( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:18 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 79 pg[9.e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:18 compute-1 sudo[86811]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:18 compute-1 sudo[86811]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:18 compute-1 sudo[86811]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:18 compute-1 sudo[86836]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:22:18 compute-1 sudo[86836]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:18 compute-1 sudo[86836]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:18 compute-1 sudo[86861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:18 compute-1 sudo[86861]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:18 compute-1 sudo[86861]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:19 compute-1 sudo[86886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:22:19 compute-1 sudo[86886]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:19 compute-1 sudo[86886]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:19 compute-1 sudo[86911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:19 compute-1 sudo[86911]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:19 compute-1 sudo[86911]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:19 compute-1 sudo[86936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:22:19 compute-1 sudo[86936]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:19.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:19 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:19 compute-1 podman[87036]: 2025-11-29 06:22:19.835354953 +0000 UTC m=+0.100195680 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Nov 29 06:22:19 compute-1 podman[87036]: 2025-11-29 06:22:19.968773819 +0000 UTC m=+0.233614526 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 06:22:20 compute-1 sshd-session[86961]: Invalid user aa from 118.194.230.250 port 50362
Nov 29 06:22:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:20.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:20 compute-1 sshd-session[86961]: Received disconnect from 118.194.230.250 port 50362:11: Bye Bye [preauth]
Nov 29 06:22:20 compute-1 sshd-session[86961]: Disconnected from invalid user aa 118.194.230.250 port 50362 [preauth]
Nov 29 06:22:20 compute-1 sudo[86936]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:21.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:22.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:23 compute-1 ceph-mon[80754]: 7.16 scrub starts
Nov 29 06:22:23 compute-1 ceph-mon[80754]: 7.16 scrub ok
Nov 29 06:22:23 compute-1 ceph-mon[80754]: 5.5 scrub starts
Nov 29 06:22:23 compute-1 ceph-mon[80754]: 5.5 scrub ok
Nov 29 06:22:23 compute-1 ceph-mon[80754]: pgmap v235: 305 pgs: 1 active+recovering+remapped, 5 active+remapped, 2 active+recovery_wait+remapped, 297 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 15/215 objects misplaced (6.977%); 30 B/s, 2 objects/s recovering
Nov 29 06:22:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:23 compute-1 ceph-mon[80754]: osdmap e79: 3 total, 3 up, 3 in
Nov 29 06:22:23 compute-1 ceph-mon[80754]: 7.1f scrub starts
Nov 29 06:22:23 compute-1 ceph-mon[80754]: 7.1f scrub ok
Nov 29 06:22:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:23.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 3.4 scrub starts
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 7.11 scrub starts
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 7.11 scrub ok
Nov 29 06:22:24 compute-1 ceph-mon[80754]: pgmap v237: 305 pgs: 4 peering, 4 active+remapped, 297 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 78 B/s, 4 objects/s recovering
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 3.4 scrub ok
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 3.1e scrub starts
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 3.1e scrub ok
Nov 29 06:22:24 compute-1 ceph-mon[80754]: pgmap v238: 305 pgs: 4 peering, 301 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 74 B/s, 4 objects/s recovering
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 6.1 scrub starts
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 6.1 scrub ok
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 3.7 scrub starts
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 3.7 scrub ok
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 4.14 scrub starts
Nov 29 06:22:24 compute-1 ceph-mon[80754]: 4.14 scrub ok
Nov 29 06:22:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:24 compute-1 sudo[87164]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:24 compute-1 sudo[87164]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:24 compute-1 sudo[87164]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:24 compute-1 sudo[87189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:22:24 compute-1 sudo[87189]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:24 compute-1 sudo[87189]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:24 compute-1 sudo[87214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:24 compute-1 sudo[87214]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:24 compute-1 sudo[87214]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:24.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:24 compute-1 sudo[87239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:22:24 compute-1 sudo[87239]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:24 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:24 compute-1 sudo[87239]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:25 compute-1 sshd-session[87291]: Invalid user precio01 from 71.70.164.48 port 49955
Nov 29 06:22:25 compute-1 ceph-mon[80754]: pgmap v239: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 46 B/s, 1 objects/s recovering
Nov 29 06:22:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Nov 29 06:22:25 compute-1 ceph-mon[80754]: 4.1d scrub starts
Nov 29 06:22:25 compute-1 ceph-mon[80754]: 4.1d scrub ok
Nov 29 06:22:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:25 compute-1 sshd-session[87291]: Received disconnect from 71.70.164.48 port 49955:11: Bye Bye [preauth]
Nov 29 06:22:25 compute-1 sshd-session[87291]: Disconnected from invalid user precio01 71.70.164.48 port 49955 [preauth]
Nov 29 06:22:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Nov 29 06:22:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:25.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:26.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:22:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:22:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Nov 29 06:22:26 compute-1 ceph-mon[80754]: osdmap e80: 3 total, 3 up, 3 in
Nov 29 06:22:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:22:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:22:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:22:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Nov 29 06:22:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Nov 29 06:22:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:27.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:27 compute-1 ceph-mon[80754]: pgmap v241: 305 pgs: 305 active+clean; 456 KiB data, 103 MiB used, 21 GiB / 21 GiB avail; 46 B/s, 1 objects/s recovering
Nov 29 06:22:27 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Nov 29 06:22:27 compute-1 ceph-mon[80754]: osdmap e81: 3 total, 3 up, 3 in
Nov 29 06:22:28 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.d scrub starts
Nov 29 06:22:28 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.d scrub ok
Nov 29 06:22:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:28.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:28 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Nov 29 06:22:28 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 82 pg[9.a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [0] r=0 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:28 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 82 pg[9.1a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [0] r=0 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:28 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Nov 29 06:22:28 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 83 pg[9.a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=83) [0]/[1] r=-1 lpr=83 pi=[58,83)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:28 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 83 pg[9.a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=83) [0]/[1] r=-1 lpr=83 pi=[58,83)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:28 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 83 pg[9.1a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=83) [0]/[1] r=-1 lpr=83 pi=[58,83)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:28 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 83 pg[9.1a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=83) [0]/[1] r=-1 lpr=83 pi=[58,83)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:29 compute-1 ceph-mon[80754]: pgmap v243: 305 pgs: 305 active+clean; 456 KiB data, 104 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:22:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Nov 29 06:22:29 compute-1 ceph-mon[80754]: 3.d scrub starts
Nov 29 06:22:29 compute-1 ceph-mon[80754]: 3.d scrub ok
Nov 29 06:22:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Nov 29 06:22:29 compute-1 ceph-mon[80754]: osdmap e82: 3 total, 3 up, 3 in
Nov 29 06:22:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:29.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 06:22:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:30.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 06:22:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Nov 29 06:22:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:31.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:32.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:33.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:34.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:34 compute-1 ceph-mon[80754]: osdmap e83: 3 total, 3 up, 3 in
Nov 29 06:22:34 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:35.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:36.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:37 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Nov 29 06:22:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:22:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:37.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:22:38 compute-1 ceph-mon[80754]: pgmap v246: 305 pgs: 4 unknown, 301 active+clean; 456 KiB data, 104 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:22:38 compute-1 ceph-mon[80754]: osdmap e84: 3 total, 3 up, 3 in
Nov 29 06:22:38 compute-1 ceph-mon[80754]: pgmap v248: 305 pgs: 4 active+remapped, 2 remapped+peering, 299 active+clean; 456 KiB data, 104 MiB used, 21 GiB / 21 GiB avail; 112 B/s, 4 objects/s recovering
Nov 29 06:22:38 compute-1 ceph-mon[80754]: 3.6 scrub starts
Nov 29 06:22:38 compute-1 ceph-mon[80754]: pgmap v249: 305 pgs: 2 peering, 2 active+remapped, 2 remapped+peering, 299 active+clean; 456 KiB data, 104 MiB used, 21 GiB / 21 GiB avail; 36 B/s, 1 objects/s recovering
Nov 29 06:22:38 compute-1 ceph-mon[80754]: 5.3 scrub starts
Nov 29 06:22:38 compute-1 ceph-mon[80754]: 3.6 scrub ok
Nov 29 06:22:38 compute-1 ceph-mon[80754]: 5.3 scrub ok
Nov 29 06:22:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:38.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:39.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:39 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:40.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Nov 29 06:22:41 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 86 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:41 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 86 pg[9.a( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:41 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 86 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:41 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 86 pg[9.a( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:41.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:42.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:42 compute-1 ceph-mon[80754]: 4.15 scrub starts
Nov 29 06:22:42 compute-1 ceph-mon[80754]: 4.15 scrub ok
Nov 29 06:22:42 compute-1 ceph-mon[80754]: 4.1f scrub starts
Nov 29 06:22:42 compute-1 ceph-mon[80754]: 4.1f scrub ok
Nov 29 06:22:42 compute-1 ceph-mon[80754]: 4.1c scrub starts
Nov 29 06:22:42 compute-1 ceph-mon[80754]: 4.1c scrub ok
Nov 29 06:22:42 compute-1 ceph-mon[80754]: pgmap v250: 305 pgs: 2 peering, 4 active+remapped, 299 active+clean; 456 KiB data, 104 MiB used, 21 GiB / 21 GiB avail; 26 KiB/s rd, 530 B/s wr, 47 op/s; 56 B/s, 3 objects/s recovering
Nov 29 06:22:42 compute-1 ceph-mon[80754]: osdmap e85: 3 total, 3 up, 3 in
Nov 29 06:22:42 compute-1 ceph-mon[80754]: 10.4 scrub starts
Nov 29 06:22:42 compute-1 ceph-mon[80754]: 10.4 scrub ok
Nov 29 06:22:42 compute-1 ceph-mon[80754]: pgmap v252: 305 pgs: 2 peering, 4 active+remapped, 299 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 25 KiB/s rd, 511 B/s wr, 45 op/s; 54 B/s, 2 objects/s recovering
Nov 29 06:22:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:22:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:43.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:22:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:22:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:44.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:22:44 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Nov 29 06:22:44 compute-1 ceph-mon[80754]: 11.3 scrub starts
Nov 29 06:22:44 compute-1 ceph-mon[80754]: 11.3 scrub ok
Nov 29 06:22:44 compute-1 ceph-mon[80754]: 8.6 scrub starts
Nov 29 06:22:44 compute-1 ceph-mon[80754]: 8.6 scrub ok
Nov 29 06:22:44 compute-1 ceph-mon[80754]: pgmap v253: 305 pgs: 2 peering, 2 active+remapped, 301 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 0 B/s, 0 objects/s recovering
Nov 29 06:22:44 compute-1 ceph-mon[80754]: 5.6 scrub starts
Nov 29 06:22:44 compute-1 ceph-mon[80754]: 5.6 scrub ok
Nov 29 06:22:44 compute-1 ceph-mon[80754]: osdmap e86: 3 total, 3 up, 3 in
Nov 29 06:22:44 compute-1 ceph-mon[80754]: 10.f deep-scrub starts
Nov 29 06:22:44 compute-1 ceph-mon[80754]: 10.f deep-scrub ok
Nov 29 06:22:44 compute-1 ceph-mon[80754]: pgmap v255: 305 pgs: 4 peering, 301 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 41 B/s, 1 objects/s recovering
Nov 29 06:22:44 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 87 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=5 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:44 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 87 pg[9.a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=6 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:22:44 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:44 compute-1 sudo[87371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:22:44 compute-1 sudo[87371]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:44 compute-1 sudo[87371]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:44 compute-1 sudo[87396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:22:44 compute-1 sudo[87396]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:22:44 compute-1 sudo[87396]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:22:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:45.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:22:46 compute-1 ceph-mon[80754]: 3.2 scrub starts
Nov 29 06:22:46 compute-1 ceph-mon[80754]: 3.2 scrub ok
Nov 29 06:22:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:46 compute-1 ceph-mon[80754]: pgmap v256: 305 pgs: 2 peering, 303 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 41 B/s, 1 objects/s recovering
Nov 29 06:22:46 compute-1 ceph-mon[80754]: osdmap e87: 3 total, 3 up, 3 in
Nov 29 06:22:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 06:22:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Nov 29 06:22:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:22:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:46.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:47 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.c scrub starts
Nov 29 06:22:47 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.c scrub ok
Nov 29 06:22:47 compute-1 ceph-mon[80754]: Reconfiguring mon.compute-0 (monmap changed)...
Nov 29 06:22:47 compute-1 ceph-mon[80754]: Reconfiguring daemon mon.compute-0 on compute-0
Nov 29 06:22:47 compute-1 ceph-mon[80754]: pgmap v258: 305 pgs: 2 peering, 303 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 41 B/s, 1 objects/s recovering
Nov 29 06:22:47 compute-1 ceph-mon[80754]: 11.8 scrub starts
Nov 29 06:22:47 compute-1 ceph-mon[80754]: 11.8 scrub ok
Nov 29 06:22:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.vxabpq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 29 06:22:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:22:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:22:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:22:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:47.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:22:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:48.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:49 compute-1 ceph-mon[80754]: 3.c scrub starts
Nov 29 06:22:49 compute-1 ceph-mon[80754]: 3.c scrub ok
Nov 29 06:22:49 compute-1 ceph-mon[80754]: Reconfiguring mgr.compute-0.vxabpq (monmap changed)...
Nov 29 06:22:49 compute-1 ceph-mon[80754]: Reconfiguring daemon mgr.compute-0.vxabpq on compute-0
Nov 29 06:22:49 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Nov 29 06:22:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:49.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Nov 29 06:22:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:50.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:50 compute-1 ceph-mon[80754]: pgmap v259: 305 pgs: 305 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 27 B/s, 1 objects/s recovering
Nov 29 06:22:50 compute-1 ceph-mon[80754]: 5.19 scrub starts
Nov 29 06:22:50 compute-1 ceph-mon[80754]: 5.19 scrub ok
Nov 29 06:22:50 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Nov 29 06:22:50 compute-1 ceph-mon[80754]: osdmap e88: 3 total, 3 up, 3 in
Nov 29 06:22:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:51.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Nov 29 06:22:52 compute-1 ceph-mon[80754]: pgmap v261: 305 pgs: 305 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail; 25 B/s, 1 objects/s recovering
Nov 29 06:22:52 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Nov 29 06:22:52 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:52 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:22:52 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 06:22:52 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:22:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:22:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:52.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:22:53 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Nov 29 06:22:53 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 90 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90) [0] r=0 lpr=90 pi=[77,90)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:53 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 90 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90) [0] r=0 lpr=90 pi=[77,90)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:22:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:53.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:54.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:55 compute-1 ceph-mon[80754]: Reconfiguring crash.compute-0 (monmap changed)...
Nov 29 06:22:55 compute-1 ceph-mon[80754]: 10.11 scrub starts
Nov 29 06:22:55 compute-1 ceph-mon[80754]: Reconfiguring daemon crash.compute-0 on compute-0
Nov 29 06:22:55 compute-1 ceph-mon[80754]: 10.11 scrub ok
Nov 29 06:22:55 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Nov 29 06:22:55 compute-1 ceph-mon[80754]: osdmap e89: 3 total, 3 up, 3 in
Nov 29 06:22:55 compute-1 ceph-mon[80754]: pgmap v263: 305 pgs: 305 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:22:55 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Nov 29 06:22:55 compute-1 ceph-mon[80754]: 8.b scrub starts
Nov 29 06:22:55 compute-1 ceph-mon[80754]: 8.b scrub ok
Nov 29 06:22:55 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Nov 29 06:22:55 compute-1 ceph-mon[80754]: osdmap e90: 3 total, 3 up, 3 in
Nov 29 06:22:55 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Nov 29 06:22:55 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Nov 29 06:22:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:55.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:22:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:56.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:22:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Nov 29 06:22:57 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 91 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=-1 lpr=91 pi=[77,91)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:57 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 91 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=-1 lpr=91 pi=[77,91)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:57 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 91 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=-1 lpr=91 pi=[77,91)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:22:57 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 91 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=-1 lpr=91 pi=[77,91)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:22:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:57.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:57 compute-1 ceph-mon[80754]: 11.16 scrub starts
Nov 29 06:22:57 compute-1 ceph-mon[80754]: 11.16 scrub ok
Nov 29 06:22:57 compute-1 ceph-mon[80754]: pgmap v265: 305 pgs: 305 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:22:57 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Nov 29 06:22:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:58.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:22:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:22:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:59.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:22:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:22:59 compute-1 sshd-session[87421]: error: kex_exchange_identification: read: Connection timed out
Nov 29 06:22:59 compute-1 sshd-session[87421]: banner exchange: Connection from 119.45.242.7 port 39270: Connection timed out
Nov 29 06:23:00 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Nov 29 06:23:00 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Nov 29 06:23:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:00.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:01.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:02.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Nov 29 06:23:03 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.f scrub starts
Nov 29 06:23:03 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.f scrub ok
Nov 29 06:23:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:23:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:03.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:23:03 compute-1 sudo[86745]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:04.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:04 compute-1 ceph-mon[80754]: 3.3 scrub starts
Nov 29 06:23:04 compute-1 ceph-mon[80754]: 3.3 scrub ok
Nov 29 06:23:04 compute-1 ceph-mon[80754]: 10.3 scrub starts
Nov 29 06:23:04 compute-1 ceph-mon[80754]: 5.a scrub starts
Nov 29 06:23:04 compute-1 ceph-mon[80754]: 5.a scrub ok
Nov 29 06:23:04 compute-1 ceph-mon[80754]: 10.3 scrub ok
Nov 29 06:23:04 compute-1 ceph-mon[80754]: pgmap v266: 305 pgs: 305 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:04 compute-1 ceph-mon[80754]: 10.10 scrub starts
Nov 29 06:23:04 compute-1 ceph-mon[80754]: 10.10 scrub ok
Nov 29 06:23:04 compute-1 ceph-mon[80754]: 3.b scrub starts
Nov 29 06:23:04 compute-1 ceph-mon[80754]: 3.b scrub ok
Nov 29 06:23:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Nov 29 06:23:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:04 compute-1 ceph-mon[80754]: osdmap e91: 3 total, 3 up, 3 in
Nov 29 06:23:04 compute-1 ceph-mon[80754]: pgmap v268: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:04 compute-1 ceph-mon[80754]: Reconfiguring osd.1 (monmap changed)...
Nov 29 06:23:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Nov 29 06:23:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:04 compute-1 ceph-mon[80754]: Reconfiguring daemon osd.1 on compute-0
Nov 29 06:23:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:05 compute-1 systemd[72642]: Created slice User Background Tasks Slice.
Nov 29 06:23:05 compute-1 systemd[72642]: Starting Cleanup of User's Temporary Files and Directories...
Nov 29 06:23:05 compute-1 systemd[72642]: Finished Cleanup of User's Temporary Files and Directories.
Nov 29 06:23:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:05.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:06 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.16 deep-scrub starts
Nov 29 06:23:06 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.16 deep-scrub ok
Nov 29 06:23:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:06.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Nov 29 06:23:06 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 93 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:06 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 93 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:06 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 93 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:06 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 93 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:07.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 10.1 scrub starts
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 10.1 scrub ok
Nov 29 06:23:07 compute-1 ceph-mon[80754]: pgmap v269: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 121 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 5.9 scrub starts
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 5.9 scrub ok
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 3.1f deep-scrub starts
Nov 29 06:23:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Nov 29 06:23:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 3.1f deep-scrub ok
Nov 29 06:23:07 compute-1 ceph-mon[80754]: osdmap e92: 3 total, 3 up, 3 in
Nov 29 06:23:07 compute-1 ceph-mon[80754]: pgmap v271: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 8.15 deep-scrub starts
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 8.15 deep-scrub ok
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 3.f scrub starts
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 3.f scrub ok
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 8.5 scrub starts
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 8.5 scrub ok
Nov 29 06:23:07 compute-1 ceph-mon[80754]: pgmap v272: 305 pgs: 2 active+remapped, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 41 B/s, 1 objects/s recovering
Nov 29 06:23:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 06:23:07 compute-1 ceph-mon[80754]: 8.1 scrub starts
Nov 29 06:23:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:08.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:09 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Nov 29 06:23:09 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Nov 29 06:23:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:09.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:09 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:10 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.6 scrub starts
Nov 29 06:23:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:10.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:10 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.6 scrub ok
Nov 29 06:23:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:23:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:11.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:23:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:12.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Nov 29 06:23:13 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 94 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94) [0] r=0 lpr=94 pi=[71,94)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:13 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 94 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94) [0] r=0 lpr=94 pi=[71,94)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:13 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 94 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=6 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:23:13 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 94 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=5 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:23:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:13.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:14 compute-1 ceph-mon[80754]: 8.1 scrub ok
Nov 29 06:23:14 compute-1 ceph-mon[80754]: 5.16 deep-scrub starts
Nov 29 06:23:14 compute-1 ceph-mon[80754]: 5.16 deep-scrub ok
Nov 29 06:23:14 compute-1 ceph-mon[80754]: pgmap v273: 305 pgs: 2 active+remapped, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 119 B/s wr, 20 op/s; 76 B/s, 2 objects/s recovering
Nov 29 06:23:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 06:23:14 compute-1 ceph-mon[80754]: osdmap e93: 3 total, 3 up, 3 in
Nov 29 06:23:14 compute-1 ceph-mon[80754]: 8.2 scrub starts
Nov 29 06:23:14 compute-1 ceph-mon[80754]: 8.2 scrub ok
Nov 29 06:23:14 compute-1 ceph-mon[80754]: 8.7 deep-scrub starts
Nov 29 06:23:14 compute-1 ceph-mon[80754]: 8.7 deep-scrub ok
Nov 29 06:23:14 compute-1 ceph-mon[80754]: pgmap v275: 305 pgs: 2 active+remapped, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 127 B/s wr, 22 op/s; 82 B/s, 3 objects/s recovering
Nov 29 06:23:14 compute-1 ceph-mon[80754]: 3.10 scrub starts
Nov 29 06:23:14 compute-1 ceph-mon[80754]: 8.16 scrub starts
Nov 29 06:23:14 compute-1 ceph-mon[80754]: 3.10 scrub ok
Nov 29 06:23:14 compute-1 ceph-mon[80754]: 8.16 scrub ok
Nov 29 06:23:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 06:23:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.004000104s ======
Nov 29 06:23:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:14.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000104s
Nov 29 06:23:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:23:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:15.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:23:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:16.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Nov 29 06:23:17 compute-1 ceph-mon[80754]: pgmap v276: 305 pgs: 2 active+remapped, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 0 B/s wr, 21 op/s; 80 B/s, 3 objects/s recovering
Nov 29 06:23:17 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 06:23:17 compute-1 ceph-mon[80754]: 10.6 scrub starts
Nov 29 06:23:17 compute-1 ceph-mon[80754]: 8.e scrub starts
Nov 29 06:23:17 compute-1 ceph-mon[80754]: 8.e scrub ok
Nov 29 06:23:17 compute-1 ceph-mon[80754]: 10.6 scrub ok
Nov 29 06:23:17 compute-1 ceph-mon[80754]: 8.13 scrub starts
Nov 29 06:23:17 compute-1 ceph-mon[80754]: pgmap v277: 305 pgs: 2 peering, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 9.6 KiB/s rd, 0 B/s wr, 17 op/s; 32 B/s, 1 objects/s recovering
Nov 29 06:23:17 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 06:23:17 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 06:23:17 compute-1 ceph-mon[80754]: 8.13 scrub ok
Nov 29 06:23:17 compute-1 ceph-mon[80754]: osdmap e94: 3 total, 3 up, 3 in
Nov 29 06:23:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:17.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:18 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.7 scrub starts
Nov 29 06:23:18 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.7 scrub ok
Nov 29 06:23:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:18.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:18 compute-1 sudo[87448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:18 compute-1 sudo[87448]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:18 compute-1 sudo[87448]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:18 compute-1 sudo[87473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:23:18 compute-1 sudo[87473]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:18 compute-1 sudo[87473]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:18 compute-1 sudo[87498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:18 compute-1 sudo[87498]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:18 compute-1 sudo[87498]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:18 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Nov 29 06:23:18 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 96 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=-1 lpr=96 pi=[71,96)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:18 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 96 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=-1 lpr=96 pi=[71,96)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:23:18 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 96 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=-1 lpr=96 pi=[71,96)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:18 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 96 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=-1 lpr=96 pi=[71,96)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:23:18 compute-1 sudo[87523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:23:18 compute-1 sudo[87523]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:19 compute-1 podman[87564]: 2025-11-29 06:23:19.121370023 +0000 UTC m=+0.066281664 container create 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:23:19 compute-1 systemd[1]: Started libpod-conmon-5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc.scope.
Nov 29 06:23:19 compute-1 podman[87564]: 2025-11-29 06:23:19.093807312 +0000 UTC m=+0.038719003 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:23:19 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:23:19 compute-1 podman[87564]: 2025-11-29 06:23:19.23243987 +0000 UTC m=+0.177351561 container init 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:23:19 compute-1 podman[87564]: 2025-11-29 06:23:19.244770482 +0000 UTC m=+0.189682123 container start 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Nov 29 06:23:19 compute-1 frosty_hermann[87580]: 167 167
Nov 29 06:23:19 compute-1 systemd[1]: libpod-5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc.scope: Deactivated successfully.
Nov 29 06:23:19 compute-1 conmon[87580]: conmon 5b512b841815487c5bcd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc.scope/container/memory.events
Nov 29 06:23:19 compute-1 ceph-mon[80754]: 8.9 deep-scrub starts
Nov 29 06:23:19 compute-1 ceph-mon[80754]: 8.9 deep-scrub ok
Nov 29 06:23:19 compute-1 ceph-mon[80754]: pgmap v279: 305 pgs: 2 peering, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:19 compute-1 ceph-mon[80754]: 8.1a deep-scrub starts
Nov 29 06:23:19 compute-1 ceph-mon[80754]: 8.1a deep-scrub ok
Nov 29 06:23:19 compute-1 ceph-mon[80754]: 11.a scrub starts
Nov 29 06:23:19 compute-1 ceph-mon[80754]: 11.a scrub ok
Nov 29 06:23:19 compute-1 ceph-mon[80754]: 8.a scrub starts
Nov 29 06:23:19 compute-1 ceph-mon[80754]: pgmap v280: 305 pgs: 2 peering, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:19 compute-1 ceph-mon[80754]: 8.a scrub ok
Nov 29 06:23:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 06:23:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 06:23:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:19 compute-1 ceph-mon[80754]: osdmap e95: 3 total, 3 up, 3 in
Nov 29 06:23:19 compute-1 ceph-mon[80754]: 10.7 scrub starts
Nov 29 06:23:19 compute-1 ceph-mon[80754]: 10.7 scrub ok
Nov 29 06:23:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 06:23:19 compute-1 podman[87564]: 2025-11-29 06:23:19.324977099 +0000 UTC m=+0.269888780 container attach 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3)
Nov 29 06:23:19 compute-1 podman[87564]: 2025-11-29 06:23:19.32576032 +0000 UTC m=+0.270671951 container died 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:23:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-79bf26297953ee68c2a2437eccb93f5b341d84aebb3b99a9b7a8a5df41d5c522-merged.mount: Deactivated successfully.
Nov 29 06:23:19 compute-1 podman[87564]: 2025-11-29 06:23:19.400211892 +0000 UTC m=+0.345123543 container remove 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Nov 29 06:23:19 compute-1 systemd[1]: libpod-conmon-5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc.scope: Deactivated successfully.
Nov 29 06:23:19 compute-1 sudo[87523]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:19.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:19 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:20 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.9 scrub starts
Nov 29 06:23:20 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.9 scrub ok
Nov 29 06:23:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:20.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:21 compute-1 sudo[87601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:21 compute-1 sudo[87601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:21 compute-1 sudo[87601]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:21 compute-1 sudo[87626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:23:21 compute-1 sudo[87626]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:21 compute-1 sudo[87626]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:21 compute-1 sudo[87651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:21 compute-1 sudo[87651]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:21 compute-1 sudo[87651]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:21 compute-1 sudo[87676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:23:21 compute-1 sudo[87676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Nov 29 06:23:21 compute-1 ceph-mon[80754]: pgmap v282: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:21 compute-1 ceph-mon[80754]: 8.f scrub starts
Nov 29 06:23:21 compute-1 ceph-mon[80754]: 8.f scrub ok
Nov 29 06:23:21 compute-1 ceph-mon[80754]: Reconfiguring crash.compute-1 (monmap changed)...
Nov 29 06:23:21 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:21 compute-1 ceph-mon[80754]: Reconfiguring daemon crash.compute-1 on compute-1
Nov 29 06:23:21 compute-1 ceph-mon[80754]: osdmap e96: 3 total, 3 up, 3 in
Nov 29 06:23:21 compute-1 ceph-mon[80754]: 8.d scrub starts
Nov 29 06:23:21 compute-1 ceph-mon[80754]: 8.d scrub ok
Nov 29 06:23:21 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:21 compute-1 podman[87718]: 2025-11-29 06:23:21.519983572 +0000 UTC m=+0.041705623 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:23:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:21.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:22.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:22 compute-1 podman[87718]: 2025-11-29 06:23:22.437938499 +0000 UTC m=+0.959660490 container create 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Nov 29 06:23:22 compute-1 systemd[1]: Started libpod-conmon-111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1.scope.
Nov 29 06:23:22 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:23:22 compute-1 sudo[87862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvrbqmerqabxqshzyxkhveoopotyldbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397401.8713772-375-227127807444152/AnsiballZ_command.py'
Nov 29 06:23:22 compute-1 podman[87718]: 2025-11-29 06:23:22.53762832 +0000 UTC m=+1.059350361 container init 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Nov 29 06:23:22 compute-1 sudo[87862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:22 compute-1 podman[87718]: 2025-11-29 06:23:22.550378614 +0000 UTC m=+1.072100585 container start 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 06:23:22 compute-1 zealous_curie[87853]: 167 167
Nov 29 06:23:22 compute-1 systemd[1]: libpod-111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1.scope: Deactivated successfully.
Nov 29 06:23:22 compute-1 podman[87718]: 2025-11-29 06:23:22.55545735 +0000 UTC m=+1.077179361 container attach 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Nov 29 06:23:22 compute-1 podman[87718]: 2025-11-29 06:23:22.556434366 +0000 UTC m=+1.078156377 container died 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:23:22 compute-1 systemd[1]: var-lib-containers-storage-overlay-0214373e0d899f4d6cc29ccd92286614bda1de2392412cf8e1699d3f67fc5d23-merged.mount: Deactivated successfully.
Nov 29 06:23:22 compute-1 podman[87718]: 2025-11-29 06:23:22.602969667 +0000 UTC m=+1.124691638 container remove 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Nov 29 06:23:22 compute-1 systemd[1]: libpod-conmon-111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1.scope: Deactivated successfully.
Nov 29 06:23:22 compute-1 sudo[87676]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:22 compute-1 python3.9[87865]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:23:23 compute-1 sudo[87862]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:23.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:23 compute-1 ceph-mon[80754]: 10.9 scrub starts
Nov 29 06:23:23 compute-1 ceph-mon[80754]: 10.9 scrub ok
Nov 29 06:23:23 compute-1 ceph-mon[80754]: pgmap v284: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 2 B/s, 0 objects/s recovering
Nov 29 06:23:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:23 compute-1 ceph-mon[80754]: Reconfiguring osd.0 (monmap changed)...
Nov 29 06:23:23 compute-1 ceph-mon[80754]: 8.1d scrub starts
Nov 29 06:23:23 compute-1 ceph-mon[80754]: 8.1d scrub ok
Nov 29 06:23:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Nov 29 06:23:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:23 compute-1 ceph-mon[80754]: Reconfiguring daemon osd.0 on compute-1
Nov 29 06:23:23 compute-1 ceph-mon[80754]: osdmap e97: 3 total, 3 up, 3 in
Nov 29 06:23:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:24.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:24 compute-1 sudo[88122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:24 compute-1 sudo[88122]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:24 compute-1 sudo[88122]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:24 compute-1 sudo[88171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:23:24 compute-1 sudo[88171]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:24 compute-1 sudo[88171]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:24 compute-1 sudo[88221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvovxzorgdhgsblzefabhkatfaenpfev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397403.8070235-399-63124126314421/AnsiballZ_selinux.py'
Nov 29 06:23:24 compute-1 sudo[88221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:24 compute-1 sudo[88223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:24 compute-1 sudo[88223]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:24 compute-1 sudo[88223]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:24 compute-1 sudo[88250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 _orch deploy --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 06:23:24 compute-1 sudo[88250]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:24 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:24 compute-1 python3.9[88231]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 29 06:23:24 compute-1 sudo[88221]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:25 compute-1 podman[88291]: 2025-11-29 06:23:25.041416218 +0000 UTC m=+0.056748828 container create 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Nov 29 06:23:25 compute-1 systemd[1]: Started libpod-conmon-015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae.scope.
Nov 29 06:23:25 compute-1 podman[88291]: 2025-11-29 06:23:25.016105337 +0000 UTC m=+0.031438017 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:23:25 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:23:25 compute-1 podman[88291]: 2025-11-29 06:23:25.137243855 +0000 UTC m=+0.152576505 container init 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Nov 29 06:23:25 compute-1 podman[88291]: 2025-11-29 06:23:25.14896645 +0000 UTC m=+0.164299090 container start 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:23:25 compute-1 serene_brown[88331]: 167 167
Nov 29 06:23:25 compute-1 podman[88291]: 2025-11-29 06:23:25.153590214 +0000 UTC m=+0.168922854 container attach 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 29 06:23:25 compute-1 systemd[1]: libpod-015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae.scope: Deactivated successfully.
Nov 29 06:23:25 compute-1 podman[88291]: 2025-11-29 06:23:25.154724375 +0000 UTC m=+0.170057005 container died 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:23:25 compute-1 systemd[1]: var-lib-containers-storage-overlay-500356c243a1abf867e004dd832e7f7f95ce9826e67cbb576dadfec2a7e3635d-merged.mount: Deactivated successfully.
Nov 29 06:23:25 compute-1 podman[88291]: 2025-11-29 06:23:25.209252532 +0000 UTC m=+0.224585152 container remove 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:23:25 compute-1 systemd[1]: libpod-conmon-015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae.scope: Deactivated successfully.
Nov 29 06:23:25 compute-1 ceph-mon[80754]: pgmap v286: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 122 MiB used, 21 GiB / 21 GiB avail; 2 B/s, 0 objects/s recovering
Nov 29 06:23:25 compute-1 ceph-mon[80754]: 8.1e scrub starts
Nov 29 06:23:25 compute-1 ceph-mon[80754]: 8.1e scrub ok
Nov 29 06:23:25 compute-1 ceph-mon[80754]: 11.19 scrub starts
Nov 29 06:23:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:25 compute-1 ceph-mon[80754]: pgmap v287: 305 pgs: 1 active+clean+scrubbing, 2 activating+remapped, 302 active+clean; 456 KiB data, 139 MiB used, 21 GiB / 21 GiB avail; 14 KiB/s rd, 296 B/s wr, 25 op/s; 12/214 objects misplaced (5.607%); 18 B/s, 1 objects/s recovering
Nov 29 06:23:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:25 compute-1 ceph-mon[80754]: Reconfiguring mon.compute-1 (monmap changed)...
Nov 29 06:23:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 06:23:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Nov 29 06:23:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:25 compute-1 ceph-mon[80754]: Reconfiguring daemon mon.compute-1 on compute-1
Nov 29 06:23:25 compute-1 sudo[88250]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:25.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:25 compute-1 sudo[88473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msrbjezopgxldmlcciheybuaqvxisabb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397405.4785218-432-135073714525892/AnsiballZ_command.py'
Nov 29 06:23:25 compute-1 sudo[88473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:25 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.a scrub starts
Nov 29 06:23:25 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.a scrub ok
Nov 29 06:23:26 compute-1 python3.9[88475]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 29 06:23:26 compute-1 sudo[88473]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:26.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Nov 29 06:23:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 98 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 98 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 98 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 98 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:26 compute-1 sudo[88625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yokybzzjrfzlpwrursnpfftjunsorsgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397406.3762758-456-4497752972629/AnsiballZ_file.py'
Nov 29 06:23:26 compute-1 sudo[88625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:26 compute-1 python3.9[88627]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:23:27 compute-1 sudo[88625]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:27.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:27 compute-1 sudo[88777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kencbxnaewteqcbajkaubvjrviwezlcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397407.2154555-480-259567205956360/AnsiballZ_mount.py'
Nov 29 06:23:27 compute-1 sudo[88777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:28 compute-1 python3.9[88779]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 29 06:23:28 compute-1 sudo[88777]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:28 compute-1 ceph-mon[80754]: 11.19 scrub ok
Nov 29 06:23:28 compute-1 ceph-mon[80754]: 11.e scrub starts
Nov 29 06:23:28 compute-1 ceph-mon[80754]: 11.e scrub ok
Nov 29 06:23:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:28.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:29 compute-1 sudo[88929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fynfcgsgpjkivmvpgevhosoeghacpwcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397409.2362964-564-245424303218893/AnsiballZ_file.py'
Nov 29 06:23:29 compute-1 sudo[88929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:29 compute-1 python3.9[88931]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:23:29 compute-1 sudo[88929]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:29.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Nov 29 06:23:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 99 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=6 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:23:29 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 99 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:23:30 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.b scrub starts
Nov 29 06:23:30 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.b scrub ok
Nov 29 06:23:30 compute-1 sudo[89081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcnlimbleweniwlmhoomvckqfwdojqwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397410.0363262-588-198362878573032/AnsiballZ_stat.py'
Nov 29 06:23:30 compute-1 sudo[89081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:30.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:30 compute-1 ceph-mon[80754]: 10.a scrub starts
Nov 29 06:23:30 compute-1 ceph-mon[80754]: 10.a scrub ok
Nov 29 06:23:30 compute-1 ceph-mon[80754]: pgmap v288: 305 pgs: 1 active+clean+scrubbing, 2 activating+remapped, 302 active+clean; 456 KiB data, 139 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 255 B/s wr, 22 op/s; 12/214 objects misplaced (5.607%); 15 B/s, 1 objects/s recovering
Nov 29 06:23:30 compute-1 ceph-mon[80754]: osdmap e98: 3 total, 3 up, 3 in
Nov 29 06:23:30 compute-1 ceph-mon[80754]: pgmap v290: 305 pgs: 1 active+clean+scrubbing, 2 activating+remapped, 302 active+clean; 456 KiB data, 143 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 255 B/s wr, 22 op/s; 12/214 objects misplaced (5.607%); 27 B/s, 1 objects/s recovering
Nov 29 06:23:30 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:30 compute-1 ceph-mon[80754]: 9.1 scrub starts
Nov 29 06:23:30 compute-1 ceph-mon[80754]: 9.1 scrub ok
Nov 29 06:23:30 compute-1 python3.9[89083]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:23:30 compute-1 sudo[89081]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:30 compute-1 sudo[89159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgfgblmzwhcoqictvghiawarrpofeftr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397410.0363262-588-198362878573032/AnsiballZ_file.py'
Nov 29 06:23:30 compute-1 sudo[89159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:31 compute-1 python3.9[89161]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:23:31 compute-1 sudo[89159]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:31.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:32 compute-1 sudo[89311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzdfssnskcluiexszbxqgswhmnloyofs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397412.0111034-651-275181070161065/AnsiballZ_stat.py'
Nov 29 06:23:32 compute-1 sudo[89311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:23:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:32.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:23:32 compute-1 python3.9[89313]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:23:32 compute-1 sudo[89311]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:33 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.c scrub starts
Nov 29 06:23:33 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.c scrub ok
Nov 29 06:23:33 compute-1 sudo[89465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syfunecfunqkjpbbzmcacekesbrlzfct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397413.3185859-690-256549214480339/AnsiballZ_getent.py'
Nov 29 06:23:33 compute-1 sudo[89465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:33.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:33 compute-1 python3.9[89467]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 29 06:23:33 compute-1 sudo[89465]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:34.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:34 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:35 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.d scrub starts
Nov 29 06:23:35 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.d scrub ok
Nov 29 06:23:35 compute-1 sudo[89618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwgdcqcklgyxobzobegndozkwseirbfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397414.696376-720-28371469944054/AnsiballZ_getent.py'
Nov 29 06:23:35 compute-1 sudo[89618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:35 compute-1 python3.9[89620]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 29 06:23:35 compute-1 sudo[89618]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:35.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:36 compute-1 sshd-session[89646]: Received disconnect from 45.55.249.98 port 43642:11: Bye Bye [preauth]
Nov 29 06:23:36 compute-1 sshd-session[89646]: Disconnected from authenticating user root 45.55.249.98 port 43642 [preauth]
Nov 29 06:23:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:36.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:37 compute-1 sudo[89773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbddkzzqudphbrpknzimmhwbsjapwkag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397416.0268037-744-195879176492813/AnsiballZ_group.py'
Nov 29 06:23:37 compute-1 sudo[89773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:37 compute-1 python3.9[89775]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:23:37 compute-1 sudo[89773]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:37 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Nov 29 06:23:37 compute-1 ceph-mon[80754]: 8.11 scrub starts
Nov 29 06:23:37 compute-1 ceph-mon[80754]: 8.11 scrub ok
Nov 29 06:23:37 compute-1 ceph-mon[80754]: osdmap e99: 3 total, 3 up, 3 in
Nov 29 06:23:37 compute-1 ceph-mon[80754]: 10.b scrub starts
Nov 29 06:23:37 compute-1 ceph-mon[80754]: pgmap v292: 305 pgs: 2 activating+remapped, 303 active+clean; 456 KiB data, 143 MiB used, 21 GiB / 21 GiB avail; 12 KiB/s rd, 255 B/s wr, 22 op/s; 12/214 objects misplaced (5.607%); 27 B/s, 1 objects/s recovering
Nov 29 06:23:37 compute-1 ceph-mon[80754]: 10.b scrub ok
Nov 29 06:23:37 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:37 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 06:23:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:37.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:38 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.e scrub starts
Nov 29 06:23:38 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.e scrub ok
Nov 29 06:23:38 compute-1 sudo[89877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:38 compute-1 sudo[89877]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:38 compute-1 sudo[89877]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:38 compute-1 sudo[89907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:23:38 compute-1 sudo[89907]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:38 compute-1 sudo[89907]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:38 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 100 pg[9.10( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=100) [0] r=0 lpr=100 pi=[58,100)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:38 compute-1 sudo[89998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuitgxfheizgxbqeabrybqbznxujjzag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397417.8986733-771-168104311654584/AnsiballZ_file.py'
Nov 29 06:23:38 compute-1 sudo[89998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:38 compute-1 sudo[89952]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:38 compute-1 sudo[89952]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:38 compute-1 sudo[89952]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:38 compute-1 sudo[90005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:23:38 compute-1 sudo[90005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:38 compute-1 ceph-mon[80754]: pgmap v293: 305 pgs: 305 active+clean; 456 KiB data, 143 MiB used, 21 GiB / 21 GiB avail; 27 B/s, 1 objects/s recovering
Nov 29 06:23:38 compute-1 ceph-mon[80754]: Reconfiguring mon.compute-2 (monmap changed)...
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 9.2 scrub starts
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 9.2 scrub ok
Nov 29 06:23:38 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 06:23:38 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Nov 29 06:23:38 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:38 compute-1 ceph-mon[80754]: Reconfiguring daemon mon.compute-2 on compute-2
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 8.3 scrub starts
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 10.c scrub starts
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 8.3 scrub ok
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 10.c scrub ok
Nov 29 06:23:38 compute-1 ceph-mon[80754]: pgmap v294: 305 pgs: 305 active+clean; 456 KiB data, 143 MiB used, 21 GiB / 21 GiB avail; 27 B/s, 1 objects/s recovering
Nov 29 06:23:38 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 10.12 scrub starts
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 10.12 scrub ok
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 10.d scrub starts
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 10.d scrub ok
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 9.4 scrub starts
Nov 29 06:23:38 compute-1 ceph-mon[80754]: pgmap v295: 305 pgs: 305 active+clean; 456 KiB data, 143 MiB used, 21 GiB / 21 GiB avail; 24 B/s, 1 objects/s recovering
Nov 29 06:23:38 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 9.c scrub starts
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 9.c scrub ok
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 9.4 scrub ok
Nov 29 06:23:38 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 29 06:23:38 compute-1 ceph-mon[80754]: osdmap e100: 3 total, 3 up, 3 in
Nov 29 06:23:38 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:38 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:38 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 10.e scrub starts
Nov 29 06:23:38 compute-1 ceph-mon[80754]: 10.e scrub ok
Nov 29 06:23:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:38.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:38 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Nov 29 06:23:38 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 101 pg[9.10( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=101) [0]/[1] r=-1 lpr=101 pi=[58,101)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:38 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 101 pg[9.10( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=101) [0]/[1] r=-1 lpr=101 pi=[58,101)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:23:38 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 101 pg[9.11( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=101) [0] r=0 lpr=101 pi=[58,101)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:38 compute-1 python3.9[90002]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 29 06:23:38 compute-1 sudo[89998]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:38 compute-1 sshd-session[89800]: Invalid user ark from 118.194.230.250 port 50470
Nov 29 06:23:38 compute-1 sshd-session[89800]: Received disconnect from 118.194.230.250 port 50470:11: Bye Bye [preauth]
Nov 29 06:23:38 compute-1 sshd-session[89800]: Disconnected from invalid user ark 118.194.230.250 port 50470 [preauth]
Nov 29 06:23:39 compute-1 podman[90127]: 2025-11-29 06:23:39.016616291 +0000 UTC m=+0.066699276 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Nov 29 06:23:39 compute-1 podman[90127]: 2025-11-29 06:23:39.142830215 +0000 UTC m=+0.192913230 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:23:39 compute-1 sudo[90005]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:39.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:39 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:40 compute-1 sudo[90373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsbyvuwgmjhwdhylkigcayhtgfjvshck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397419.531863-804-24288332912162/AnsiballZ_dnf.py'
Nov 29 06:23:40 compute-1 sudo[90373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:40 compute-1 python3.9[90375]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:23:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:40.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:41 compute-1 sudo[90373]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:41.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:42 compute-1 ceph-mon[80754]: pgmap v297: 305 pgs: 2 active+clean+scrubbing, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 27 B/s, 1 objects/s recovering
Nov 29 06:23:42 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 29 06:23:42 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Nov 29 06:23:42 compute-1 ceph-mon[80754]: osdmap e101: 3 total, 3 up, 3 in
Nov 29 06:23:42 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Nov 29 06:23:42 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 102 pg[9.11( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=102) [0]/[1] r=-1 lpr=102 pi=[58,102)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:42 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 102 pg[9.11( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=102) [0]/[1] r=-1 lpr=102 pi=[58,102)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:23:42 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.16 deep-scrub starts
Nov 29 06:23:42 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.16 deep-scrub ok
Nov 29 06:23:42 compute-1 sudo[90401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:42 compute-1 sudo[90401]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:42 compute-1 sudo[90401]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:42 compute-1 sudo[90426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:23:42 compute-1 sudo[90426]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:42 compute-1 sudo[90426]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:42.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:42 compute-1 sudo[90451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:23:42 compute-1 sudo[90451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:42 compute-1 sudo[90451]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:42 compute-1 sudo[90476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:23:42 compute-1 sudo[90476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:23:42 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Nov 29 06:23:42 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 103 pg[9.10( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=101/58 les/c/f=102/59/0 sis=103) [0] r=0 lpr=103 pi=[58,103)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:42 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 103 pg[9.10( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=101/58 les/c/f=102/59/0 sis=103) [0] r=0 lpr=103 pi=[58,103)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:42 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 103 pg[9.12( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=103) [0] r=0 lpr=103 pi=[58,103)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:43 compute-1 sudo[90476]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:43 compute-1 ceph-mon[80754]: pgmap v299: 305 pgs: 2 active+clean+scrubbing, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Nov 29 06:23:43 compute-1 ceph-mon[80754]: 10.1e scrub starts
Nov 29 06:23:43 compute-1 ceph-mon[80754]: 10.1e scrub ok
Nov 29 06:23:43 compute-1 ceph-mon[80754]: 9.12 scrub starts
Nov 29 06:23:43 compute-1 ceph-mon[80754]: 9.12 scrub ok
Nov 29 06:23:43 compute-1 ceph-mon[80754]: pgmap v300: 305 pgs: 1 unknown, 1 remapped+peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:43 compute-1 ceph-mon[80754]: 11.13 scrub starts
Nov 29 06:23:43 compute-1 ceph-mon[80754]: 11.13 scrub ok
Nov 29 06:23:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-1 ceph-mon[80754]: osdmap e102: 3 total, 3 up, 3 in
Nov 29 06:23:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-1 ceph-mon[80754]: 10.16 deep-scrub starts
Nov 29 06:23:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-1 ceph-mon[80754]: 10.16 deep-scrub ok
Nov 29 06:23:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Nov 29 06:23:43 compute-1 ceph-mon[80754]: osdmap e103: 3 total, 3 up, 3 in
Nov 29 06:23:43 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.17 deep-scrub starts
Nov 29 06:23:43 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.17 deep-scrub ok
Nov 29 06:23:43 compute-1 sudo[90656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deaamupdsnrfrfubcajzvxjpttfgeqgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397423.17434-828-257043890809776/AnsiballZ_file.py'
Nov 29 06:23:43 compute-1 sudo[90656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:43 compute-1 python3.9[90658]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:23:43 compute-1 sudo[90656]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:23:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:43.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:23:44 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1a scrub starts
Nov 29 06:23:44 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1a scrub ok
Nov 29 06:23:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:44.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:44 compute-1 sudo[90808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdlmopoutctrhyprhatnfglmnavcqryl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397424.0771222-852-149271454933694/AnsiballZ_stat.py'
Nov 29 06:23:44 compute-1 sudo[90808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:44 compute-1 python3.9[90810]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:23:44 compute-1 sudo[90808]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:44 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:44 compute-1 sudo[90887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gclpmheyyywswxcgqycunnlknulonfyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397424.0771222-852-149271454933694/AnsiballZ_file.py'
Nov 29 06:23:44 compute-1 sudo[90887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:45 compute-1 python3.9[90889]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:23:45 compute-1 sudo[90887]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:23:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:45.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:23:46 compute-1 sudo[91039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktngtjwldzdtywcgefkkpwocrpzfgeia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397425.917787-892-47371675931882/AnsiballZ_stat.py'
Nov 29 06:23:46 compute-1 sudo[91039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:46.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:46 compute-1 python3.9[91041]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:23:46 compute-1 sudo[91039]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:46 compute-1 sudo[91117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pleqkjqiixnscphcwrwccexldhfpoqdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397425.917787-892-47371675931882/AnsiballZ_file.py'
Nov 29 06:23:46 compute-1 sudo[91117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:47 compute-1 python3.9[91119]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:23:47 compute-1 sudo[91117]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Nov 29 06:23:47 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 104 pg[9.11( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=102/58 les/c/f=103/59/0 sis=104) [0] r=0 lpr=104 pi=[58,104)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:47 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 104 pg[9.11( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=102/58 les/c/f=103/59/0 sis=104) [0] r=0 lpr=104 pi=[58,104)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:47 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 104 pg[9.12( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=104) [0]/[1] r=-1 lpr=104 pi=[58,104)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:47 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 104 pg[9.12( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=104) [0]/[1] r=-1 lpr=104 pi=[58,104)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:23:47 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 104 pg[9.10( v 56'1130 (0'0,56'1130] local-lis/les=103/104 n=6 ec=58/47 lis/c=101/58 les/c/f=102/59/0 sis=103) [0] r=0 lpr=103 pi=[58,103)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:23:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:23:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:47.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:23:47 compute-1 sudo[91269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzmkghuhginjmllclkpctpcktibdcuqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397427.6221561-936-58018858751216/AnsiballZ_dnf.py'
Nov 29 06:23:47 compute-1 sudo[91269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:48 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1c scrub starts
Nov 29 06:23:48 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1c scrub ok
Nov 29 06:23:48 compute-1 python3.9[91271]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:23:48 compute-1 ceph-mon[80754]: 9.14 scrub starts
Nov 29 06:23:48 compute-1 ceph-mon[80754]: 9.14 scrub ok
Nov 29 06:23:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:23:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:23:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:23:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:23:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:23:48 compute-1 ceph-mon[80754]: 10.17 deep-scrub starts
Nov 29 06:23:48 compute-1 ceph-mon[80754]: 10.17 deep-scrub ok
Nov 29 06:23:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:48.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:49 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1d scrub starts
Nov 29 06:23:49 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1d scrub ok
Nov 29 06:23:49 compute-1 sshd-session[90112]: error: kex_exchange_identification: read: Connection timed out
Nov 29 06:23:49 compute-1 sshd-session[90112]: banner exchange: Connection from 119.45.242.7 port 50056: Connection timed out
Nov 29 06:23:49 compute-1 sudo[91269]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:23:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:49.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:23:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Nov 29 06:23:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:50.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 105 pg[9.11( v 56'1130 (0'0,56'1130] local-lis/les=104/105 n=6 ec=58/47 lis/c=102/58 les/c/f=103/59/0 sis=104) [0] r=0 lpr=104 pi=[58,104)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:23:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1f scrub starts
Nov 29 06:23:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1f scrub ok
Nov 29 06:23:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:23:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:51.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:23:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:23:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:52.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:23:52 compute-1 python3.9[91422]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:23:53 compute-1 python3.9[91574]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 29 06:23:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:53.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:54.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:23:55 compute-1 ceph-mon[80754]: pgmap v303: 305 pgs: 1 unknown, 1 remapped+peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:55 compute-1 ceph-mon[80754]: 10.1a scrub starts
Nov 29 06:23:55 compute-1 ceph-mon[80754]: 10.1a scrub ok
Nov 29 06:23:55 compute-1 ceph-mon[80754]: 9.1c scrub starts
Nov 29 06:23:55 compute-1 ceph-mon[80754]: 9.1c scrub ok
Nov 29 06:23:55 compute-1 ceph-mon[80754]: pgmap v304: 305 pgs: 1 unknown, 1 remapped+peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:23:55 compute-1 ceph-mon[80754]: osdmap e104: 3 total, 3 up, 3 in
Nov 29 06:23:55 compute-1 ceph-mon[80754]: 11.2 scrub starts
Nov 29 06:23:55 compute-1 ceph-mon[80754]: 11.2 scrub ok
Nov 29 06:23:55 compute-1 ceph-mon[80754]: pgmap v306: 305 pgs: 1 unknown, 1 active+remapped, 1 peering, 302 active+clean; 455 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 18 B/s, 1 objects/s recovering
Nov 29 06:23:55 compute-1 ceph-mon[80754]: 10.1c scrub starts
Nov 29 06:23:55 compute-1 ceph-mon[80754]: 10.1c scrub ok
Nov 29 06:23:55 compute-1 python3.9[91724]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:23:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:23:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:55.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:23:56 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.10 scrub starts
Nov 29 06:23:56 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.10 scrub ok
Nov 29 06:23:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:23:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:56.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 10.1d scrub starts
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 10.1d scrub ok
Nov 29 06:23:56 compute-1 ceph-mon[80754]: pgmap v307: 305 pgs: 1 unknown, 1 active+remapped, 1 peering, 302 active+clean; 455 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 13 B/s, 0 objects/s recovering
Nov 29 06:23:56 compute-1 ceph-mon[80754]: osdmap e105: 3 total, 3 up, 3 in
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 10.1f scrub starts
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 10.1f scrub ok
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 8.1c scrub starts
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 8.1c scrub ok
Nov 29 06:23:56 compute-1 ceph-mon[80754]: pgmap v309: 305 pgs: 1 activating+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 4/219 objects misplaced (1.826%); 13 B/s, 0 objects/s recovering
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 11.6 scrub starts
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 8.1f scrub starts
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 8.1f scrub ok
Nov 29 06:23:56 compute-1 ceph-mon[80754]: pgmap v310: 305 pgs: 1 activating+remapped, 304 active+clean; 455 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 4/219 objects misplaced (1.826%); 13 B/s, 0 objects/s recovering
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 11.9 scrub starts
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 8.c scrub starts
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 8.c scrub ok
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 11.6 scrub ok
Nov 29 06:23:56 compute-1 ceph-mon[80754]: 11.9 scrub ok
Nov 29 06:23:56 compute-1 sudo[91874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzjzwdrpdbfhpoycwmhxuqqqezvsrmgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397436.1395445-1059-256952977639511/AnsiballZ_systemd.py'
Nov 29 06:23:56 compute-1 sudo[91874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:57 compute-1 python3.9[91876]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:23:57 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 29 06:23:57 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Nov 29 06:23:57 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 29 06:23:57 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.14 scrub starts
Nov 29 06:23:57 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 06:23:57 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.14 scrub ok
Nov 29 06:23:57 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 06:23:57 compute-1 sudo[91874]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:23:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:57.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:23:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:23:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:58.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:23:59 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.17 scrub starts
Nov 29 06:23:59 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.17 scrub ok
Nov 29 06:23:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Nov 29 06:23:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 106 pg[9.12( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=4 ec=58/47 lis/c=104/58 les/c/f=105/59/0 sis=106) [0] r=0 lpr=106 pi=[58,106)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:23:59 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 106 pg[9.12( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=4 ec=58/47 lis/c=104/58 les/c/f=105/59/0 sis=106) [0] r=0 lpr=106 pi=[58,106)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:23:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:23:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:23:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:59.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:23:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:00 compute-1 python3.9[92039]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 29 06:24:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:24:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:00.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:24:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:24:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:01.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:24:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:24:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:02.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:24:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:24:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:03.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:24:04 compute-1 ceph-mon[80754]: 11.17 scrub starts
Nov 29 06:24:04 compute-1 ceph-mon[80754]: 11.17 scrub ok
Nov 29 06:24:04 compute-1 ceph-mon[80754]: pgmap v311: 305 pgs: 1 activating+remapped, 304 active+clean; 455 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 4/219 objects misplaced (1.826%)
Nov 29 06:24:04 compute-1 ceph-mon[80754]: 8.10 scrub starts
Nov 29 06:24:04 compute-1 ceph-mon[80754]: 8.10 scrub ok
Nov 29 06:24:04 compute-1 ceph-mon[80754]: 11.b scrub starts
Nov 29 06:24:04 compute-1 ceph-mon[80754]: 11.b scrub ok
Nov 29 06:24:04 compute-1 ceph-mon[80754]: 11.14 scrub starts
Nov 29 06:24:04 compute-1 ceph-mon[80754]: 11.14 scrub ok
Nov 29 06:24:04 compute-1 ceph-mon[80754]: 11.c scrub starts
Nov 29 06:24:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:04.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:04 compute-1 sudo[92189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feaviocszafayfiibkqabaqsqisapisd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397444.2262561-1230-275082916958274/AnsiballZ_systemd.py'
Nov 29 06:24:04 compute-1 sudo[92189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:04 compute-1 python3.9[92191]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:24:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:04 compute-1 sudo[92189]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Nov 29 06:24:05 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1e scrub starts
Nov 29 06:24:05 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1e scrub ok
Nov 29 06:24:05 compute-1 sudo[92343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unsvnspctgnmdvvfldviqsbwqgnvkfow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397445.1604476-1230-118170517859049/AnsiballZ_systemd.py'
Nov 29 06:24:05 compute-1 sudo[92343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:05 compute-1 python3.9[92345]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:24:05 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 107 pg[9.12( v 56'1130 (0'0,56'1130] local-lis/les=106/107 n=4 ec=58/47 lis/c=104/58 les/c/f=105/59/0 sis=106) [0] r=0 lpr=106 pi=[58,106)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:24:05 compute-1 sudo[92343]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:05 compute-1 ceph-mon[80754]: pgmap v312: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 11.c scrub ok
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 9.17 deep-scrub starts
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 9.17 deep-scrub ok
Nov 29 06:24:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 8.17 scrub starts
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 8.17 scrub ok
Nov 29 06:24:05 compute-1 ceph-mon[80754]: pgmap v313: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:24:05 compute-1 ceph-mon[80754]: osdmap e106: 3 total, 3 up, 3 in
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 11.d scrub starts
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 9.1b scrub starts
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 9.1b scrub ok
Nov 29 06:24:05 compute-1 ceph-mon[80754]: pgmap v315: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 11.d scrub ok
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 11.10 scrub starts
Nov 29 06:24:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 06:24:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 9.7 deep-scrub starts
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 9.7 deep-scrub ok
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 11.10 scrub ok
Nov 29 06:24:05 compute-1 ceph-mon[80754]: pgmap v316: 305 pgs: 1 active+clean+scrubbing, 1 active+remapped, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 9.b scrub starts
Nov 29 06:24:05 compute-1 ceph-mon[80754]: 9.b scrub ok
Nov 29 06:24:05 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 06:24:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:24:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:05.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:24:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:06.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:07 compute-1 sshd-session[84831]: Connection closed by 192.168.122.30 port 49582
Nov 29 06:24:07 compute-1 sshd-session[84828]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:24:07 compute-1 systemd[1]: session-34.scope: Deactivated successfully.
Nov 29 06:24:07 compute-1 systemd[1]: session-34.scope: Consumed 1min 12.068s CPU time.
Nov 29 06:24:07 compute-1 systemd-logind[785]: Session 34 logged out. Waiting for processes to exit.
Nov 29 06:24:07 compute-1 systemd-logind[785]: Removed session 34.
Nov 29 06:24:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:07.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:07 compute-1 sudo[92372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:24:07 compute-1 sudo[92372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:24:07 compute-1 sudo[92372]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:08 compute-1 sudo[92397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:24:08 compute-1 sudo[92397]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:24:08 compute-1 sudo[92397]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:08.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:09.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:09 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:24:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:10.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:24:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Nov 29 06:24:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 06:24:11 compute-1 ceph-mon[80754]: 11.1e scrub starts
Nov 29 06:24:11 compute-1 ceph-mon[80754]: 11.1e scrub ok
Nov 29 06:24:11 compute-1 ceph-mon[80754]: 11.11 scrub starts
Nov 29 06:24:11 compute-1 ceph-mon[80754]: 11.11 scrub ok
Nov 29 06:24:11 compute-1 ceph-mon[80754]: osdmap e107: 3 total, 3 up, 3 in
Nov 29 06:24:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:24:11 compute-1 ceph-mon[80754]: pgmap v318: 305 pgs: 1 active+clean+scrubbing, 1 active+remapped, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 06:24:11 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.7 deep-scrub starts
Nov 29 06:24:11 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.7 deep-scrub ok
Nov 29 06:24:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:11.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:24:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:12.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:24:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:24:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:13.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:24:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Nov 29 06:24:14 compute-1 ceph-mon[80754]: 11.15 deep-scrub starts
Nov 29 06:24:14 compute-1 ceph-mon[80754]: 11.15 deep-scrub ok
Nov 29 06:24:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:24:14 compute-1 ceph-mon[80754]: pgmap v319: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 06:24:14 compute-1 ceph-mon[80754]: pgmap v320: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 06:24:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 06:24:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 06:24:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 06:24:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 06:24:14 compute-1 ceph-mon[80754]: 11.18 scrub starts
Nov 29 06:24:14 compute-1 ceph-mon[80754]: osdmap e108: 3 total, 3 up, 3 in
Nov 29 06:24:14 compute-1 ceph-mon[80754]: 11.7 deep-scrub starts
Nov 29 06:24:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:14.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:15 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.1b scrub starts
Nov 29 06:24:15 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.1b scrub ok
Nov 29 06:24:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:15.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:16 compute-1 sshd-session[92422]: Accepted publickey for zuul from 192.168.122.30 port 56162 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:24:16 compute-1 systemd-logind[785]: New session 35 of user zuul.
Nov 29 06:24:16 compute-1 systemd[1]: Started Session 35 of User zuul.
Nov 29 06:24:16 compute-1 sshd-session[92422]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:24:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:16.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:17 compute-1 python3.9[92575]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:24:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:24:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:17.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:24:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:24:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:18.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:24:18 compute-1 sudo[92729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slynffpuvleymadacjfwafgpdxgsxuku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397458.3988595-74-144600000383770/AnsiballZ_getent.py'
Nov 29 06:24:18 compute-1 sudo[92729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:19 compute-1 python3.9[92731]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 29 06:24:19 compute-1 sudo[92729]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:19 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.4 deep-scrub starts
Nov 29 06:24:19 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.4 deep-scrub ok
Nov 29 06:24:19 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Nov 29 06:24:19 compute-1 ceph-mon[80754]: 11.18 scrub ok
Nov 29 06:24:19 compute-1 ceph-mon[80754]: 9.13 scrub starts
Nov 29 06:24:19 compute-1 ceph-mon[80754]: 9.13 scrub ok
Nov 29 06:24:19 compute-1 ceph-mon[80754]: 11.7 deep-scrub ok
Nov 29 06:24:19 compute-1 ceph-mon[80754]: pgmap v322: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 06:24:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 06:24:19 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 110 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=110) [0] r=0 lpr=110 pi=[77,110)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:24:19 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:24:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:19.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:24:20 compute-1 sudo[92882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdrplrwpatyhdeezbhymxgtcfhkulxht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397459.6673234-110-105569316677012/AnsiballZ_setup.py'
Nov 29 06:24:20 compute-1 sudo[92882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:20 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.12 scrub starts
Nov 29 06:24:20 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.12 scrub ok
Nov 29 06:24:20 compute-1 python3.9[92884]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:24:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:24:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:20.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:24:20 compute-1 sudo[92882]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Nov 29 06:24:21 compute-1 sudo[92966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uemgocrdqqcdsdqzrmnwflealfyavcjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397459.6673234-110-105569316677012/AnsiballZ_dnf.py'
Nov 29 06:24:21 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 111 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=111) [0]/[2] r=-1 lpr=111 pi=[77,111)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:21 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 111 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=111) [0]/[2] r=-1 lpr=111 pi=[77,111)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:21 compute-1 sudo[92966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:21 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 111 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=111 pruub=13.241782188s) [2] r=-1 lpr=111 pi=[78,111)/1 crt=56'1130 mlcod 0'0 active pruub 358.085052490s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:21 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 111 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=111 pruub=13.241716385s) [2] r=-1 lpr=111 pi=[78,111)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 358.085052490s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:21 compute-1 python3.9[92968]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:24:21 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1d scrub starts
Nov 29 06:24:21 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1d scrub ok
Nov 29 06:24:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:21.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:22 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.8 scrub starts
Nov 29 06:24:22 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.8 scrub ok
Nov 29 06:24:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:22.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:22 compute-1 sudo[92966]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:23.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 06:24:24 compute-1 ceph-mon[80754]: osdmap e109: 3 total, 3 up, 3 in
Nov 29 06:24:24 compute-1 ceph-mon[80754]: pgmap v324: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 9.3 scrub starts
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 9.3 scrub ok
Nov 29 06:24:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 8.1b scrub starts
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 8.1b scrub ok
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 11.1f deep-scrub starts
Nov 29 06:24:24 compute-1 ceph-mon[80754]: pgmap v325: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 9.15 scrub starts
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 9.15 scrub ok
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 10.8 deep-scrub starts
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 9.5 scrub starts
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 9.5 scrub ok
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 11.1f deep-scrub ok
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 10.8 deep-scrub ok
Nov 29 06:24:24 compute-1 ceph-mon[80754]: pgmap v326: 305 pgs: 2 active+clean+scrubbing+deep, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 9.9 scrub starts
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 9.9 scrub ok
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 10.14 deep-scrub starts
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 9.19 scrub starts
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 8.4 deep-scrub starts
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 8.4 deep-scrub ok
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 9.19 scrub ok
Nov 29 06:24:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 10.14 deep-scrub ok
Nov 29 06:24:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 06:24:24 compute-1 ceph-mon[80754]: osdmap e110: 3 total, 3 up, 3 in
Nov 29 06:24:24 compute-1 ceph-mon[80754]: pgmap v328: 305 pgs: 2 active+clean+scrubbing+deep, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 8.12 scrub starts
Nov 29 06:24:24 compute-1 ceph-mon[80754]: 8.12 scrub ok
Nov 29 06:24:24 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.f scrub starts
Nov 29 06:24:24 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.f scrub ok
Nov 29 06:24:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:24.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:24 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Nov 29 06:24:25 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 112 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=112) [2]/[0] r=0 lpr=112 pi=[78,112)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:25 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 112 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=112) [2]/[0] r=0 lpr=112 pi=[78,112)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:24:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:25.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:26 compute-1 sudo[93119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esnxvytvuygqxbznsuizbuewxoquyzag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397465.714007-152-167919787553833/AnsiballZ_dnf.py'
Nov 29 06:24:26 compute-1 sudo[93119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:26 compute-1 python3.9[93121]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:24:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:24:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:26.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:24:27 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.14 scrub starts
Nov 29 06:24:27 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.14 scrub ok
Nov 29 06:24:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:27.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:28 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.4 scrub starts
Nov 29 06:24:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 06:24:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:28.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 06:24:28 compute-1 sudo[93119]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:28 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.4 scrub ok
Nov 29 06:24:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 06:24:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 06:24:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 06:24:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 10.13 deep-scrub starts
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 10.13 deep-scrub ok
Nov 29 06:24:29 compute-1 ceph-mon[80754]: osdmap e111: 3 total, 3 up, 3 in
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 11.1d scrub starts
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 11.1d scrub ok
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 9.8 scrub starts
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 10.5 scrub starts
Nov 29 06:24:29 compute-1 ceph-mon[80754]: pgmap v330: 305 pgs: 2 active+clean+scrubbing+deep, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 9.8 scrub ok
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 10.5 scrub ok
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 8.8 scrub starts
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 8.8 scrub ok
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 9.18 scrub starts
Nov 29 06:24:29 compute-1 ceph-mon[80754]: pgmap v331: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 9.18 scrub ok
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 10.1b scrub starts
Nov 29 06:24:29 compute-1 ceph-mon[80754]: 10.1b scrub ok
Nov 29 06:24:29 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Nov 29 06:24:29 compute-1 ceph-mon[80754]: osdmap e112: 3 total, 3 up, 3 in
Nov 29 06:24:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:29.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:30 compute-1 sudo[93272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txybflczsbrvhqtwfrbzhptcbrqgcqyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397469.2523167-176-178205903353860/AnsiballZ_systemd.py'
Nov 29 06:24:30 compute-1 sudo[93272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:30 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1a scrub starts
Nov 29 06:24:30 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1a scrub ok
Nov 29 06:24:30 compute-1 python3.9[93274]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:24:30 compute-1 sudo[93272]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 06:24:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:30.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 06:24:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Nov 29 06:24:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 113 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=4 ec=58/47 lis/c=111/77 les/c/f=112/79/0 sis=113) [0] r=0 lpr=113 pi=[77,113)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:30 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 113 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=4 ec=58/47 lis/c=111/77 les/c/f=112/79/0 sis=113) [0] r=0 lpr=113 pi=[77,113)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 06:24:31 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 113 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=112/113 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=112) [2]/[0] async=[2] r=0 lpr=112 pi=[78,112)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 11.f scrub starts
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 11.f scrub ok
Nov 29 06:24:31 compute-1 ceph-mon[80754]: pgmap v333: 305 pgs: 1 active+clean+scrubbing, 1 unknown, 1 remapped+peering, 302 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 10.18 scrub starts
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 10.18 scrub ok
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 10.2 scrub starts
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 10.2 scrub ok
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 8.14 scrub starts
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 8.14 scrub ok
Nov 29 06:24:31 compute-1 ceph-mon[80754]: pgmap v334: 305 pgs: 1 active+clean+scrubbing, 1 unknown, 1 remapped+peering, 302 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 10.19 scrub starts
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 10.19 scrub ok
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 11.4 scrub starts
Nov 29 06:24:31 compute-1 ceph-mon[80754]: 11.4 scrub ok
Nov 29 06:24:31 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.19 scrub starts
Nov 29 06:24:31 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.19 scrub ok
Nov 29 06:24:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:31.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:32 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1c scrub starts
Nov 29 06:24:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:32.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:33 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.5 scrub starts
Nov 29 06:24:33 compute-1 python3.9[93427]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:24:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:33.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:34 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.5 scrub ok
Nov 29 06:24:34 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1c scrub ok
Nov 29 06:24:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:34.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:34 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Nov 29 06:24:34 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 114 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=112/113 n=5 ec=58/47 lis/c=112/78 les/c/f=113/79/0 sis=114 pruub=12.418749809s) [2] async=[2] r=-1 lpr=114 pi=[78,114)/1 crt=56'1130 mlcod 56'1130 active pruub 370.969604492s@ mbc={255={}}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:34 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 114 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=112/113 n=5 ec=58/47 lis/c=112/78 les/c/f=113/79/0 sis=114 pruub=12.417335510s) [2] r=-1 lpr=114 pi=[78,114)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 370.969604492s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:34 compute-1 sudo[93578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shleujphsvjcupbjkveatbepaxkdoexz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397474.2784965-230-165222784228108/AnsiballZ_sefcontext.py'
Nov 29 06:24:34 compute-1 sudo[93578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:35 compute-1 python3.9[93580]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 29 06:24:35 compute-1 ceph-mon[80754]: pgmap v335: 305 pgs: 1 active+remapped, 1 unknown, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:35 compute-1 ceph-mon[80754]: 11.1a scrub starts
Nov 29 06:24:35 compute-1 ceph-mon[80754]: 11.1a scrub ok
Nov 29 06:24:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Nov 29 06:24:35 compute-1 ceph-mon[80754]: osdmap e113: 3 total, 3 up, 3 in
Nov 29 06:24:35 compute-1 ceph-mon[80754]: 8.19 scrub starts
Nov 29 06:24:35 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1 deep-scrub starts
Nov 29 06:24:35 compute-1 sudo[93578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:35 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 114 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=113/114 n=4 ec=58/47 lis/c=111/77 les/c/f=112/79/0 sis=113) [0] r=0 lpr=113 pi=[77,113)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:24:35 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1 deep-scrub ok
Nov 29 06:24:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:35.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:36 compute-1 python3.9[93730]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:24:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:24:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:36.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:24:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Nov 29 06:24:37 compute-1 ceph-mon[80754]: 8.19 scrub ok
Nov 29 06:24:37 compute-1 ceph-mon[80754]: pgmap v337: 305 pgs: 1 active+remapped, 1 peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:37 compute-1 ceph-mon[80754]: 10.15 scrub starts
Nov 29 06:24:37 compute-1 ceph-mon[80754]: 10.15 scrub ok
Nov 29 06:24:37 compute-1 ceph-mon[80754]: 11.1c scrub starts
Nov 29 06:24:37 compute-1 ceph-mon[80754]: 11.5 scrub starts
Nov 29 06:24:37 compute-1 ceph-mon[80754]: pgmap v338: 305 pgs: 1 active+remapped, 1 peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 0 B/s, 0 objects/s recovering
Nov 29 06:24:37 compute-1 ceph-mon[80754]: osdmap e114: 3 total, 3 up, 3 in
Nov 29 06:24:37 compute-1 ceph-mon[80754]: 11.5 scrub ok
Nov 29 06:24:37 compute-1 ceph-mon[80754]: 11.1c scrub ok
Nov 29 06:24:37 compute-1 ceph-mon[80754]: 11.1 deep-scrub starts
Nov 29 06:24:37 compute-1 sudo[93886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtdxiuokoujxmvovfzpaqxnkhykkliwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397477.001359-284-114354330888265/AnsiballZ_dnf.py'
Nov 29 06:24:37 compute-1 sudo[93886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:37 compute-1 python3.9[93888]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:24:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:37.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:37 compute-1 ceph-mon[80754]: 11.1 deep-scrub ok
Nov 29 06:24:37 compute-1 ceph-mon[80754]: osdmap e115: 3 total, 3 up, 3 in
Nov 29 06:24:37 compute-1 ceph-mon[80754]: pgmap v341: 305 pgs: 1 active+remapped, 1 peering, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 8.2 KiB/s rd, 170 B/s wr, 14 op/s; 36 B/s, 1 objects/s recovering
Nov 29 06:24:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:38.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:39 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Nov 29 06:24:39 compute-1 sudo[93886]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:39 compute-1 ceph-mon[80754]: pgmap v342: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 6.5 KiB/s rd, 0 B/s wr, 11 op/s; 29 B/s, 0 objects/s recovering
Nov 29 06:24:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Nov 29 06:24:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:39.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:40 compute-1 sudo[94039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpolrtihcqcudpcwkootdrlxidhyxybs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397479.697954-308-91582904403348/AnsiballZ_command.py'
Nov 29 06:24:40 compute-1 sudo[94039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:40 compute-1 python3.9[94041]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:24:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:40.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Nov 29 06:24:40 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Nov 29 06:24:40 compute-1 ceph-mon[80754]: osdmap e116: 3 total, 3 up, 3 in
Nov 29 06:24:40 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Nov 29 06:24:40 compute-1 sshd-session[94042]: Received disconnect from 71.70.164.48 port 49053:11: Bye Bye [preauth]
Nov 29 06:24:40 compute-1 sshd-session[94042]: Disconnected from authenticating user root 71.70.164.48 port 49053 [preauth]
Nov 29 06:24:41 compute-1 sudo[94039]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:41 compute-1 ceph-mon[80754]: pgmap v344: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:41 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Nov 29 06:24:41 compute-1 ceph-mon[80754]: osdmap e117: 3 total, 3 up, 3 in
Nov 29 06:24:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Nov 29 06:24:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:41.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:42 compute-1 sudo[94328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzhjugfqirscybnhycrxhkjhshrbvynl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397481.5483966-332-279361042966577/AnsiballZ_file.py'
Nov 29 06:24:42 compute-1 sudo[94328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:42 compute-1 python3.9[94330]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 06:24:42 compute-1 sudo[94328]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:42 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.12 scrub starts
Nov 29 06:24:42 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.12 scrub ok
Nov 29 06:24:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:42.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:42 compute-1 ceph-mon[80754]: osdmap e118: 3 total, 3 up, 3 in
Nov 29 06:24:42 compute-1 ceph-mon[80754]: pgmap v347: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:42 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Nov 29 06:24:42 compute-1 ceph-mon[80754]: 9.16 scrub starts
Nov 29 06:24:42 compute-1 ceph-mon[80754]: 9.16 scrub ok
Nov 29 06:24:43 compute-1 python3.9[94480]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:24:43 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Nov 29 06:24:43 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 119 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=5 ec=58/47 lis/c=86/86 les/c/f=87/87/0 sis=119 pruub=8.766153336s) [1] r=-1 lpr=119 pi=[86,119)/1 crt=56'1130 mlcod 0'0 active pruub 376.257568359s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:43 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 119 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=5 ec=58/47 lis/c=86/86 les/c/f=87/87/0 sis=119 pruub=8.766060829s) [1] r=-1 lpr=119 pi=[86,119)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 376.257568359s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:43.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:44 compute-1 ceph-mon[80754]: 11.12 scrub starts
Nov 29 06:24:44 compute-1 ceph-mon[80754]: 11.12 scrub ok
Nov 29 06:24:44 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Nov 29 06:24:44 compute-1 ceph-mon[80754]: osdmap e119: 3 total, 3 up, 3 in
Nov 29 06:24:44 compute-1 sudo[94633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgdsqpvaphpkwsubsyqrioolkhpvypdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397483.9126103-380-139594713854980/AnsiballZ_dnf.py'
Nov 29 06:24:44 compute-1 sudo[94633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:44 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Nov 29 06:24:44 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 120 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=5 ec=58/47 lis/c=86/86 les/c/f=87/87/0 sis=120) [1]/[0] r=0 lpr=120 pi=[86,120)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:44 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 120 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=5 ec=58/47 lis/c=86/86 les/c/f=87/87/0 sis=120) [1]/[0] r=0 lpr=120 pi=[86,120)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:24:44 compute-1 sshd-session[94627]: Connection closed by 80.94.92.182 port 43532
Nov 29 06:24:44 compute-1 python3.9[94635]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:24:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:44.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:45.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:46 compute-1 ceph-mon[80754]: pgmap v349: 305 pgs: 1 active+recovering+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 6/212 objects misplaced (2.830%)
Nov 29 06:24:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Nov 29 06:24:46 compute-1 ceph-mon[80754]: osdmap e120: 3 total, 3 up, 3 in
Nov 29 06:24:46 compute-1 sudo[94633]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Nov 29 06:24:46 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 121 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=120/121 n=5 ec=58/47 lis/c=86/86 les/c/f=87/87/0 sis=120) [1]/[0] async=[1] r=0 lpr=120 pi=[86,120)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:24:46 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1b scrub starts
Nov 29 06:24:46 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1b scrub ok
Nov 29 06:24:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:46.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:46 compute-1 sudo[94786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sncnwvtevrbaieeyjzdfhecpvyzxsepp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397486.462125-407-153889475381994/AnsiballZ_dnf.py'
Nov 29 06:24:46 compute-1 sudo[94786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:47 compute-1 python3.9[94788]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:24:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:47.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:48 compute-1 ceph-mon[80754]: pgmap v351: 305 pgs: 1 active+recovering+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 6/212 objects misplaced (2.830%)
Nov 29 06:24:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Nov 29 06:24:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Nov 29 06:24:48 compute-1 ceph-mon[80754]: osdmap e121: 3 total, 3 up, 3 in
Nov 29 06:24:48 compute-1 sudo[94786]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:48 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.18 scrub starts
Nov 29 06:24:48 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.18 scrub ok
Nov 29 06:24:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:48.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:49 compute-1 sshd-session[94814]: Invalid user apagar from 45.55.249.98 port 33804
Nov 29 06:24:49 compute-1 sshd-session[94814]: Received disconnect from 45.55.249.98 port 33804:11: Bye Bye [preauth]
Nov 29 06:24:49 compute-1 sshd-session[94814]: Disconnected from invalid user apagar 45.55.249.98 port 33804 [preauth]
Nov 29 06:24:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:49.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:50 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.e scrub starts
Nov 29 06:24:50 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.e scrub ok
Nov 29 06:24:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:50.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.1e scrub starts
Nov 29 06:24:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.1e scrub ok
Nov 29 06:24:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Nov 29 06:24:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 122 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=120/121 n=5 ec=58/47 lis/c=120/86 les/c/f=121/87/0 sis=122 pruub=10.483498573s) [1] async=[1] r=-1 lpr=122 pi=[86,122)/1 crt=56'1130 mlcod 56'1130 active pruub 386.142822266s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:24:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 122 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=120/121 n=5 ec=58/47 lis/c=120/86 les/c/f=121/87/0 sis=122 pruub=10.483215332s) [1] r=-1 lpr=122 pi=[86,122)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 386.142822266s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:24:51 compute-1 ceph-mon[80754]: 11.1b scrub starts
Nov 29 06:24:51 compute-1 ceph-mon[80754]: 11.1b scrub ok
Nov 29 06:24:51 compute-1 ceph-mon[80754]: pgmap v353: 305 pgs: 1 unknown, 1 active+remapped, 1 peering, 302 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 0 B/s, 0 objects/s recovering
Nov 29 06:24:51 compute-1 ceph-mon[80754]: 8.18 scrub starts
Nov 29 06:24:51 compute-1 ceph-mon[80754]: 8.18 scrub ok
Nov 29 06:24:51 compute-1 ceph-mon[80754]: pgmap v354: 305 pgs: 1 unknown, 1 active+remapped, 1 peering, 302 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 0 B/s, 0 objects/s recovering
Nov 29 06:24:51 compute-1 ceph-mon[80754]: 9.e scrub starts
Nov 29 06:24:51 compute-1 ceph-mon[80754]: 9.e scrub ok
Nov 29 06:24:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:24:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:51.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:24:52 compute-1 sudo[94941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmdwhyflvsvbvprmpjojfvwigrwekdtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397492.1810884-443-163922556582773/AnsiballZ_stat.py'
Nov 29 06:24:52 compute-1 sudo[94941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:52.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:52 compute-1 python3.9[94943]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:24:52 compute-1 sudo[94941]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:53 compute-1 sudo[95095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsrllzpkcpcqkqqerkbcyldpdkidwkzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397492.9459157-467-262824654328839/AnsiballZ_slurp.py'
Nov 29 06:24:53 compute-1 sudo[95095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:53 compute-1 python3.9[95097]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Nov 29 06:24:53 compute-1 sudo[95095]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:53.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Nov 29 06:24:54 compute-1 ceph-mon[80754]: 9.1e scrub starts
Nov 29 06:24:54 compute-1 ceph-mon[80754]: 9.1e scrub ok
Nov 29 06:24:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Nov 29 06:24:54 compute-1 ceph-mon[80754]: osdmap e122: 3 total, 3 up, 3 in
Nov 29 06:24:54 compute-1 ceph-mon[80754]: pgmap v356: 305 pgs: 1 unknown, 1 active+remapped, 1 peering, 302 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 14 B/s, 0 objects/s recovering
Nov 29 06:24:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:24:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:54.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:24:55 compute-1 sshd-session[92425]: Connection closed by 192.168.122.30 port 56162
Nov 29 06:24:55 compute-1 sshd-session[92422]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:24:55 compute-1 systemd[1]: session-35.scope: Deactivated successfully.
Nov 29 06:24:55 compute-1 systemd[1]: session-35.scope: Consumed 19.310s CPU time.
Nov 29 06:24:55 compute-1 systemd-logind[785]: Session 35 logged out. Waiting for processes to exit.
Nov 29 06:24:55 compute-1 systemd-logind[785]: Removed session 35.
Nov 29 06:24:55 compute-1 sshd-session[95122]: Received disconnect from 118.194.230.250 port 50576:11: Bye Bye [preauth]
Nov 29 06:24:55 compute-1 sshd-session[95122]: Disconnected from authenticating user root 118.194.230.250 port 50576 [preauth]
Nov 29 06:24:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:24:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:55.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:56 compute-1 ceph-mon[80754]: pgmap v357: 305 pgs: 1 peering, 1 unknown, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 13 B/s, 0 objects/s recovering
Nov 29 06:24:56 compute-1 ceph-mon[80754]: osdmap e123: 3 total, 3 up, 3 in
Nov 29 06:24:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:24:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:56.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:24:57 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.6 deep-scrub starts
Nov 29 06:24:57 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.6 deep-scrub ok
Nov 29 06:24:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:24:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:57.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:24:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:58.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:24:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Nov 29 06:24:59 compute-1 ceph-mon[80754]: pgmap v359: 305 pgs: 1 peering, 1 unknown, 303 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:24:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:24:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:24:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:59.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:00.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:01 compute-1 anacron[30895]: Job `cron.daily' started
Nov 29 06:25:01 compute-1 anacron[30895]: Job `cron.daily' terminated
Nov 29 06:25:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:01.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Nov 29 06:25:02 compute-1 ceph-mon[80754]: 9.6 deep-scrub starts
Nov 29 06:25:02 compute-1 ceph-mon[80754]: 9.6 deep-scrub ok
Nov 29 06:25:02 compute-1 ceph-mon[80754]: pgmap v360: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:02.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:02 compute-1 sshd-session[95127]: Accepted publickey for zuul from 192.168.122.30 port 52916 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:25:02 compute-1 systemd-logind[785]: New session 36 of user zuul.
Nov 29 06:25:02 compute-1 systemd[1]: Started Session 36 of User zuul.
Nov 29 06:25:02 compute-1 sshd-session[95127]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:25:03 compute-1 python3.9[95280]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:25:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:03.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:04.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:05 compute-1 python3.9[95434]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:25:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:05.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:06 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.a scrub starts
Nov 29 06:25:06 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.a scrub ok
Nov 29 06:25:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Nov 29 06:25:06 compute-1 ceph-mon[80754]: 9.1a scrub starts
Nov 29 06:25:06 compute-1 ceph-mon[80754]: 9.1a scrub ok
Nov 29 06:25:06 compute-1 ceph-mon[80754]: pgmap v362: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 13 B/s, 0 objects/s recovering
Nov 29 06:25:06 compute-1 ceph-mon[80754]: osdmap e124: 3 total, 3 up, 3 in
Nov 29 06:25:06 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Nov 29 06:25:06 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Nov 29 06:25:06 compute-1 ceph-mon[80754]: osdmap e125: 3 total, 3 up, 3 in
Nov 29 06:25:06 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Nov 29 06:25:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:06.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:06 compute-1 python3.9[95629]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:25:07 compute-1 sshd-session[95566]: Invalid user jason from 66.94.122.234 port 52364
Nov 29 06:25:07 compute-1 sshd-session[95130]: Connection closed by 192.168.122.30 port 52916
Nov 29 06:25:07 compute-1 sshd-session[95127]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:25:07 compute-1 systemd[1]: session-36.scope: Deactivated successfully.
Nov 29 06:25:07 compute-1 systemd[1]: session-36.scope: Consumed 2.635s CPU time.
Nov 29 06:25:07 compute-1 systemd-logind[785]: Session 36 logged out. Waiting for processes to exit.
Nov 29 06:25:07 compute-1 systemd-logind[785]: Removed session 36.
Nov 29 06:25:07 compute-1 sshd-session[95566]: Received disconnect from 66.94.122.234 port 52364:11: Bye Bye [preauth]
Nov 29 06:25:07 compute-1 sshd-session[95566]: Disconnected from invalid user jason 66.94.122.234 port 52364 [preauth]
Nov 29 06:25:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:07.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:08 compute-1 sudo[95655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:08 compute-1 sudo[95655]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:08 compute-1 sudo[95655]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:08 compute-1 sudo[95680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:25:08 compute-1 sudo[95680]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:08 compute-1 sudo[95680]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:08 compute-1 sudo[95705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:08 compute-1 sudo[95705]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:08 compute-1 sudo[95705]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:08 compute-1 sudo[95730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:25:08 compute-1 sudo[95730]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:08.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:08 compute-1 ceph-mon[80754]: pgmap v363: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail; 13 B/s, 0 objects/s recovering
Nov 29 06:25:08 compute-1 ceph-mon[80754]: pgmap v365: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:08 compute-1 ceph-mon[80754]: pgmap v366: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:08 compute-1 ceph-mon[80754]: 9.a scrub starts
Nov 29 06:25:08 compute-1 ceph-mon[80754]: 9.a scrub ok
Nov 29 06:25:08 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Nov 29 06:25:08 compute-1 ceph-mon[80754]: osdmap e126: 3 total, 3 up, 3 in
Nov 29 06:25:09 compute-1 podman[95826]: 2025-11-29 06:25:09.003610094 +0000 UTC m=+0.082734655 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Nov 29 06:25:09 compute-1 podman[95826]: 2025-11-29 06:25:09.112967198 +0000 UTC m=+0.192091809 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Nov 29 06:25:09 compute-1 sudo[95730]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:09.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:10.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:11.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:12.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:13.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Nov 29 06:25:14 compute-1 ceph-mon[80754]: pgmap v368: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 06:25:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:14.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:15 compute-1 sshd-session[95946]: Accepted publickey for zuul from 192.168.122.30 port 55758 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:25:15 compute-1 systemd-logind[785]: New session 37 of user zuul.
Nov 29 06:25:15 compute-1 systemd[1]: Started Session 37 of User zuul.
Nov 29 06:25:15 compute-1 sshd-session[95946]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:25:15 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:15 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 127 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=127 pruub=13.503620148s) [2] r=-1 lpr=127 pi=[93,127)/1 crt=56'1130 mlcod 0'0 active pruub 413.242828369s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:25:15 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 127 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=127 pruub=13.503321648s) [2] r=-1 lpr=127 pi=[93,127)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 413.242828369s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:25:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:25:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:16.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:25:16 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.d scrub starts
Nov 29 06:25:16 compute-1 python3.9[96099]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:25:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Nov 29 06:25:16 compute-1 ceph-mon[80754]: pgmap v369: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 06:25:16 compute-1 ceph-mon[80754]: pgmap v370: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 06:25:16 compute-1 ceph-mon[80754]: pgmap v371: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 06:25:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 06:25:16 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:16 compute-1 ceph-mon[80754]: osdmap e127: 3 total, 3 up, 3 in
Nov 29 06:25:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:16.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:16 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.d scrub ok
Nov 29 06:25:17 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.f scrub starts
Nov 29 06:25:17 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.f scrub ok
Nov 29 06:25:17 compute-1 python3.9[96253]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:25:17 compute-1 sudo[96254]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:17 compute-1 sudo[96254]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:17 compute-1 sudo[96254]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:17 compute-1 sudo[96283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:25:17 compute-1 sudo[96283]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:17 compute-1 sudo[96283]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:17 compute-1 sudo[96308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:17 compute-1 sudo[96308]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:17 compute-1 sudo[96308]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:17 compute-1 sudo[96333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:25:17 compute-1 sudo[96333]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:18.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:18 compute-1 sudo[96333]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:18 compute-1 sudo[96535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzktralyfoacnchahqrdtxbppvoictxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397517.994987-86-21998069042053/AnsiballZ_setup.py'
Nov 29 06:25:18 compute-1 sudo[96535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:18.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:18 compute-1 python3.9[96537]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:25:19 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Nov 29 06:25:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 06:25:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 06:25:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 06:25:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:19 compute-1 ceph-mon[80754]: osdmap e128: 3 total, 3 up, 3 in
Nov 29 06:25:19 compute-1 ceph-mon[80754]: pgmap v374: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Nov 29 06:25:19 compute-1 sudo[96535]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:19 compute-1 sudo[96619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygsfplmeidlcywcbgykvpbmnyfcovitg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397517.994987-86-21998069042053/AnsiballZ_dnf.py'
Nov 29 06:25:19 compute-1 sudo[96619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:19 compute-1 python3.9[96621]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:25:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:25:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:20.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:25:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:20.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:21 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 129 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=0 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:25:21 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 129 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=0 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:25:21 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 129 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=129 pruub=9.221545219s) [1] r=-1 lpr=129 pi=[78,129)/1 crt=56'1130 mlcod 0'0 active pruub 414.086791992s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:25:21 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 129 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=129 pruub=9.221508980s) [1] r=-1 lpr=129 pi=[78,129)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 414.086791992s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:25:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:22.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:22 compute-1 ceph-mon[80754]: 9.d scrub starts
Nov 29 06:25:22 compute-1 ceph-mon[80754]: 9.d scrub ok
Nov 29 06:25:22 compute-1 ceph-mon[80754]: 9.f scrub starts
Nov 29 06:25:22 compute-1 ceph-mon[80754]: 9.f scrub ok
Nov 29 06:25:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:22 compute-1 ceph-mon[80754]: pgmap v375: 305 pgs: 305 active+clean; 456 KiB data, 144 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Nov 29 06:25:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Nov 29 06:25:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:22 compute-1 ceph-mon[80754]: osdmap e129: 3 total, 3 up, 3 in
Nov 29 06:25:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:22 compute-1 sudo[96619]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:22 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Nov 29 06:25:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:25:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:22.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:25:23 compute-1 sudo[96772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giwlcxzlhombpgrqzvpawhvrzmafahhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397522.6776342-122-177294543917812/AnsiballZ_setup.py'
Nov 29 06:25:23 compute-1 sudo[96772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:23 compute-1 python3.9[96774]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:25:23 compute-1 sudo[96772]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:23 compute-1 ceph-mon[80754]: pgmap v377: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 145 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:25:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:25:23 compute-1 ceph-mon[80754]: pgmap v378: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 145 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Nov 29 06:25:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:23 compute-1 ceph-mon[80754]: osdmap e130: 3 total, 3 up, 3 in
Nov 29 06:25:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:25:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:25:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:25:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:25:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:24.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:25:24 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Nov 29 06:25:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 131 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=131) [1]/[0] r=0 lpr=131 pi=[78,131)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:25:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 131 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=131) [1]/[0] r=0 lpr=131 pi=[78,131)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:25:24 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 131 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] async=[2] r=0 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:25:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:24.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:25 compute-1 sudo[96967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elrzvcbvrorbwjruuvhvhfddlgadacsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397524.4440372-155-189589261523382/AnsiballZ_file.py'
Nov 29 06:25:25 compute-1 sudo[96967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Nov 29 06:25:25 compute-1 ceph-mon[80754]: osdmap e131: 3 total, 3 up, 3 in
Nov 29 06:25:25 compute-1 ceph-mon[80754]: pgmap v381: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 145 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:25 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132 pruub=15.171678543s) [2] async=[2] r=-1 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 56'1130 active pruub 424.108947754s@ mbc={255={}}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:25:25 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132 pruub=15.171549797s) [2] r=-1 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 424.108947754s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:25:25 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 132 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=131/132 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=131) [1]/[0] async=[1] r=0 lpr=131 pi=[78,131)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:25:25 compute-1 python3.9[96969]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:25 compute-1 sudo[96967]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:26 compute-1 sudo[97119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvjmbrfzgoohacjmxwvydbyiynkmdsqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397525.5301178-179-49694932020337/AnsiballZ_command.py'
Nov 29 06:25:26 compute-1 sudo[97119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:25:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:26.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:25:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:26 compute-1 python3.9[97121]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:25:26 compute-1 sudo[97119]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Nov 29 06:25:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 133 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133 pruub=14.523239136s) [1] async=[1] r=-1 lpr=133 pi=[78,133)/1 crt=56'1130 mlcod 56'1130 active pruub 424.948211670s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:25:26 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 133 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133 pruub=14.522854805s) [1] r=-1 lpr=133 pi=[78,133)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 424.948211670s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:25:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:26.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:26 compute-1 ceph-mon[80754]: osdmap e132: 3 total, 3 up, 3 in
Nov 29 06:25:26 compute-1 sudo[97282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nquzcloerebzjsmajxlgaroihqfgykvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397526.4862478-203-40777329085197/AnsiballZ_stat.py'
Nov 29 06:25:26 compute-1 sudo[97282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:27 compute-1 python3.9[97284]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:25:27 compute-1 sudo[97282]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:27 compute-1 sudo[97360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fahgdvezrsvmbzhjddsfsqryywtbdkog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397526.4862478-203-40777329085197/AnsiballZ_file.py'
Nov 29 06:25:27 compute-1 sudo[97360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:27 compute-1 python3.9[97362]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:27 compute-1 sudo[97360]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Nov 29 06:25:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:28.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:28 compute-1 sudo[97512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjrxeniqewfdmjuojzmozcsdmqjwiumb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397527.922829-239-208928359071028/AnsiballZ_stat.py'
Nov 29 06:25:28 compute-1 sudo[97512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:28 compute-1 ceph-mon[80754]: pgmap v383: 305 pgs: 2 unknown, 303 active+clean; 456 KiB data, 145 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:28 compute-1 ceph-mon[80754]: osdmap e133: 3 total, 3 up, 3 in
Nov 29 06:25:28 compute-1 python3.9[97514]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:25:28 compute-1 sudo[97512]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:28.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:28 compute-1 sudo[97590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpqwvkuhwwpauvggtjsvozejtjrlesjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397527.922829-239-208928359071028/AnsiballZ_file.py'
Nov 29 06:25:28 compute-1 sudo[97590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:28 compute-1 python3.9[97592]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:25:28 compute-1 sudo[97590]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:29 compute-1 sudo[97742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybsjewqqzzlimsqtgolzrcvgupnrwbpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397529.3041916-278-277211970926146/AnsiballZ_ini_file.py'
Nov 29 06:25:29 compute-1 sudo[97742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:30.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:30 compute-1 python3.9[97744]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:25:30 compute-1 sudo[97742]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:30 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.1f deep-scrub starts
Nov 29 06:25:30 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.1f deep-scrub ok
Nov 29 06:25:30 compute-1 sudo[97894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvitgbavevaqhjcrfcwulpognukgrsbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397530.2933545-278-183263038731183/AnsiballZ_ini_file.py'
Nov 29 06:25:30 compute-1 sudo[97894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:30.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:30 compute-1 python3.9[97896]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:25:30 compute-1 sudo[97894]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:31 compute-1 sudo[98046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmahbmunywkadkudehcsqcowaxybgsxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397531.049215-278-12010495533436/AnsiballZ_ini_file.py'
Nov 29 06:25:31 compute-1 sudo[98046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:31 compute-1 python3.9[98048]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:25:31 compute-1 sudo[98046]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:25:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:32.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:25:32 compute-1 sudo[98198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hntclyasremuaoxxwatrrwgtuaukfsbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397531.7488983-278-251380978728289/AnsiballZ_ini_file.py'
Nov 29 06:25:32 compute-1 sudo[98198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:32 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.10 scrub starts
Nov 29 06:25:32 compute-1 python3.9[98200]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:25:32 compute-1 sudo[98198]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:32.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:33 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.10 scrub ok
Nov 29 06:25:33 compute-1 sudo[98350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qafmmwrvnkaeugwkooitxhlkqrupumfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397532.7631376-371-74877647411105/AnsiballZ_dnf.py'
Nov 29 06:25:33 compute-1 sudo[98350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:33 compute-1 python3.9[98352]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:25:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:34.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:34.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:35 compute-1 ceph-mon[80754]: osdmap e134: 3 total, 3 up, 3 in
Nov 29 06:25:35 compute-1 ceph-mon[80754]: pgmap v386: 305 pgs: 1 peering, 1 unknown, 303 active+clean; 456 KiB data, 145 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:35 compute-1 sudo[98350]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:36.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 06:25:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:36.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:36 compute-1 sudo[98503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gntydxaiksidqazfjnmjndoqsdglzsfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397536.4771996-404-76465779889956/AnsiballZ_setup.py'
Nov 29 06:25:36 compute-1 sudo[98503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:37 compute-1 python3.9[98505]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:25:37 compute-1 sudo[98503]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:37 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.11 scrub starts
Nov 29 06:25:37 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.11 scrub ok
Nov 29 06:25:37 compute-1 sudo[98657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyklsxbiqwlmwgqqkeupazzhemiimrtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397537.3972614-428-195257822552306/AnsiballZ_stat.py'
Nov 29 06:25:37 compute-1 sudo[98657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:37 compute-1 python3.9[98659]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:25:37 compute-1 sudo[98657]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:38.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:25:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:38.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:25:38 compute-1 sudo[98809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkxzshporctryxfqwjhmylrbowxamart ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397538.3591022-455-158526531766684/AnsiballZ_stat.py'
Nov 29 06:25:38 compute-1 sudo[98809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:38 compute-1 python3.9[98811]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:25:38 compute-1 sudo[98809]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:39 compute-1 sudo[98961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rceqgwcnyfoytwzlobqyekfmjyepavug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397539.2079465-485-142978227802320/AnsiballZ_command.py'
Nov 29 06:25:39 compute-1 sudo[98961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:39 compute-1 python3.9[98963]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:25:39 compute-1 sudo[98961]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:40.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:40 compute-1 sudo[99114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-migcgwcsqgzlwiurcoxmdusvdjrcqeai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397540.081837-515-59304427040085/AnsiballZ_service_facts.py'
Nov 29 06:25:40 compute-1 sudo[99114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:40.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:40 compute-1 python3.9[99116]: ansible-service_facts Invoked
Nov 29 06:25:40 compute-1 network[99133]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:25:40 compute-1 network[99134]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:25:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Nov 29 06:25:40 compute-1 network[99135]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:25:41 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.898561478s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 active pruub 437.762023926s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:25:41 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:25:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:25:41 compute-1 ceph-mon[80754]: pgmap v387: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 8.3 KiB/s rd, 170 B/s wr, 15 op/s; 109 B/s, 2 objects/s recovering
Nov 29 06:25:41 compute-1 ceph-mon[80754]: 9.1f deep-scrub starts
Nov 29 06:25:41 compute-1 ceph-mon[80754]: 9.1f deep-scrub ok
Nov 29 06:25:41 compute-1 ceph-mon[80754]: pgmap v388: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 62 B/s, 0 objects/s recovering
Nov 29 06:25:41 compute-1 ceph-mon[80754]: 9.10 scrub starts
Nov 29 06:25:41 compute-1 ceph-mon[80754]: 9.1d scrub starts
Nov 29 06:25:41 compute-1 ceph-mon[80754]: 9.1d scrub ok
Nov 29 06:25:41 compute-1 ceph-mon[80754]: 9.10 scrub ok
Nov 29 06:25:41 compute-1 ceph-mon[80754]: pgmap v389: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:41 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:25:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:42.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:42.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:44.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:44.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:45 compute-1 sudo[99114]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:46.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:25:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Nov 29 06:25:46 compute-1 ceph-mon[80754]: pgmap v390: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:25:46 compute-1 ceph-mon[80754]: 9.11 scrub starts
Nov 29 06:25:46 compute-1 ceph-mon[80754]: 9.11 scrub ok
Nov 29 06:25:46 compute-1 ceph-mon[80754]: pgmap v391: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:46 compute-1 ceph-mon[80754]: pgmap v392: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:25:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:25:46 compute-1 ceph-mon[80754]: osdmap e135: 3 total, 3 up, 3 in
Nov 29 06:25:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 06:25:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:46.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:46 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:25:46 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:25:47 compute-1 sudo[99418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yskufpdempiqtmdbkdxkvacsgiagbwsk ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764397546.7329378-560-96070085807657/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764397546.7329378-560-96070085807657/args'
Nov 29 06:25:47 compute-1 sudo[99418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:47 compute-1 sudo[99418]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:48 compute-1 sudo[99585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgaldctqptzfjwjsxvcgtundpbgzierz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397547.6627457-593-242977821729301/AnsiballZ_dnf.py'
Nov 29 06:25:48 compute-1 sudo[99585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:25:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:48.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:25:48 compute-1 python3.9[99587]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:25:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:48.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:49 compute-1 sudo[99585]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:50 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:25:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:50.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:50.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:25:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Nov 29 06:25:51 compute-1 ceph-mon[80754]: pgmap v394: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:51 compute-1 ceph-mon[80754]: pgmap v395: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:25:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:25:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 06:25:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:52.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:52 compute-1 sudo[99738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llwljglrwqwtkzjxrwtjevpwolzetfhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397551.706877-632-225140180504751/AnsiballZ_package_facts.py'
Nov 29 06:25:52 compute-1 sudo[99738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:52 compute-1 python3.9[99740]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 29 06:25:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:52.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:52 compute-1 sudo[99738]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:53 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:25:53 compute-1 ceph-mon[80754]: pgmap v396: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:53 compute-1 ceph-mon[80754]: osdmap e136: 3 total, 3 up, 3 in
Nov 29 06:25:53 compute-1 ceph-mon[80754]: pgmap v398: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:53 compute-1 ceph-mon[80754]: pgmap v399: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:53 compute-1 ceph-mon[80754]: osdmap e137: 3 total, 3 up, 3 in
Nov 29 06:25:53 compute-1 ceph-mon[80754]: pgmap v401: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:25:53 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:54 compute-1 sudo[99890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gawinarbjlrbsfcxamuswtbfuzpvbgba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397553.6755166-662-39607316094704/AnsiballZ_stat.py'
Nov 29 06:25:54 compute-1 sudo[99890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:54.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:54 compute-1 python3.9[99892]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:25:54 compute-1 sudo[99890]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:54 compute-1 sudo[99895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:25:54 compute-1 sudo[99895]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:54 compute-1 sudo[99895]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:54 compute-1 sudo[99943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:25:54 compute-1 sudo[99943]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:25:54 compute-1 sudo[99943]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:54 compute-1 sudo[100018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbfzysqfhxzxoewsuusqgizlvvcmygmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397553.6755166-662-39607316094704/AnsiballZ_file.py'
Nov 29 06:25:54 compute-1 sudo[100018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:25:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:54.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:25:54 compute-1 python3.9[100020]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:54 compute-1 sudo[100018]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:55 compute-1 sudo[100170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaxsbzlvcbwivyslpyvcpsytxkcmazja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397555.06213-699-139785197950952/AnsiballZ_stat.py'
Nov 29 06:25:55 compute-1 sudo[100170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:55 compute-1 python3.9[100172]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:25:55 compute-1 sudo[100170]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:56 compute-1 sudo[100248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiuamjmfwutdzlbmxomcjfbfkxsfwmhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397555.06213-699-139785197950952/AnsiballZ_file.py'
Nov 29 06:25:56 compute-1 sudo[100248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:56.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:25:56 compute-1 python3.9[100250]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:56 compute-1 sudo[100248]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:56.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:57 compute-1 sudo[100400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wznwrbzdrqwiubmxklawiqfxrjgaiiob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397557.2650223-753-206632318417704/AnsiballZ_lineinfile.py'
Nov 29 06:25:57 compute-1 sudo[100400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:57 compute-1 python3.9[100402]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:57 compute-1 sudo[100400]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:25:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:58.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:25:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:25:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:25:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:58.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:25:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Nov 29 06:25:59 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:25:59 compute-1 sudo[100552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpjdihwtneoiezjnecxwoxmysouywcrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397559.4426053-798-198516021651597/AnsiballZ_setup.py'
Nov 29 06:25:59 compute-1 sudo[100552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:00.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:00 compute-1 python3.9[100554]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:26:00 compute-1 sudo[100552]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:00.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:01 compute-1 sudo[100636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkygjvjaainuxqoexzjyotdiytsakqaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397559.4426053-798-198516021651597/AnsiballZ_systemd.py'
Nov 29 06:26:01 compute-1 sudo[100636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:01 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:01 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.835290909s) [1] async=[1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 56'1130 active pruub 461.014038086s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:26:01 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:26:01 compute-1 python3.9[100638]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:26:01 compute-1 sudo[100636]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:02.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:02.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:02 compute-1 sshd-session[95949]: Connection closed by 192.168.122.30 port 55758
Nov 29 06:26:02 compute-1 sshd-session[95946]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:26:02 compute-1 systemd[1]: session-37.scope: Deactivated successfully.
Nov 29 06:26:02 compute-1 systemd[1]: session-37.scope: Consumed 26.501s CPU time.
Nov 29 06:26:02 compute-1 systemd-logind[785]: Session 37 logged out. Waiting for processes to exit.
Nov 29 06:26:02 compute-1 systemd-logind[785]: Removed session 37.
Nov 29 06:26:02 compute-1 sshd-session[100665]: Received disconnect from 45.55.249.98 port 49314:11: Bye Bye [preauth]
Nov 29 06:26:02 compute-1 sshd-session[100665]: Disconnected from authenticating user root 45.55.249.98 port 49314 [preauth]
Nov 29 06:26:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:04.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 e139: 3 total, 3 up, 3 in
Nov 29 06:26:04 compute-1 ceph-mon[80754]: pgmap v402: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:04 compute-1 ceph-mon[80754]: pgmap v403: 305 pgs: 1 unknown, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:04 compute-1 ceph-mon[80754]: pgmap v404: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:26:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:04.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:06.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:06.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:08.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:08 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:08 compute-1 sshd-session[100667]: Received disconnect from 118.194.230.250 port 50674:11: Bye Bye [preauth]
Nov 29 06:26:08 compute-1 sshd-session[100667]: Disconnected from authenticating user root 118.194.230.250 port 50674 [preauth]
Nov 29 06:26:08 compute-1 sshd-session[100669]: Accepted publickey for zuul from 192.168.122.30 port 37518 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:26:08 compute-1 systemd-logind[785]: New session 38 of user zuul.
Nov 29 06:26:08 compute-1 systemd[1]: Started Session 38 of User zuul.
Nov 29 06:26:08 compute-1 sshd-session[100669]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:26:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:08.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:09 compute-1 sudo[100822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tugvqvjqcpfdmikxzrqqbcfdgutrepel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397568.898428-32-47906852184802/AnsiballZ_file.py'
Nov 29 06:26:09 compute-1 sudo[100822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:09 compute-1 ceph-mon[80754]: pgmap v406: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 12 B/s, 0 objects/s recovering
Nov 29 06:26:09 compute-1 ceph-mon[80754]: osdmap e138: 3 total, 3 up, 3 in
Nov 29 06:26:09 compute-1 ceph-mon[80754]: pgmap v407: 305 pgs: 1 active+remapped, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:26:09 compute-1 ceph-mon[80754]: pgmap v408: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:09 compute-1 ceph-mon[80754]: osdmap e139: 3 total, 3 up, 3 in
Nov 29 06:26:09 compute-1 python3.9[100824]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:09 compute-1 sudo[100822]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:10.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:10 compute-1 sudo[100974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcnldmqtjkehagpvxmcqwbjupltzjvty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397569.9360678-68-99136659421858/AnsiballZ_stat.py'
Nov 29 06:26:10 compute-1 sudo[100974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:10 compute-1 python3.9[100976]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:10 compute-1 sudo[100974]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:10.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:10 compute-1 ceph-mon[80754]: pgmap v410: 305 pgs: 1 peering, 304 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:10 compute-1 ceph-mon[80754]: pgmap v411: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:10 compute-1 sudo[101052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghbbqgnckwpiufufgiqtjykzcotaysna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397569.9360678-68-99136659421858/AnsiballZ_file.py'
Nov 29 06:26:10 compute-1 sudo[101052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:11 compute-1 python3.9[101054]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:11 compute-1 sudo[101052]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:11 compute-1 sshd-session[100672]: Connection closed by 192.168.122.30 port 37518
Nov 29 06:26:11 compute-1 sshd-session[100669]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:26:11 compute-1 systemd[1]: session-38.scope: Deactivated successfully.
Nov 29 06:26:11 compute-1 systemd[1]: session-38.scope: Consumed 1.842s CPU time.
Nov 29 06:26:11 compute-1 systemd-logind[785]: Session 38 logged out. Waiting for processes to exit.
Nov 29 06:26:11 compute-1 systemd-logind[785]: Removed session 38.
Nov 29 06:26:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:12.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:12.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:13 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:13 compute-1 ceph-mon[80754]: pgmap v412: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:14.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.669173) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574669319, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7845, "num_deletes": 255, "total_data_size": 16680430, "memory_usage": 16933152, "flush_reason": "Manual Compaction"}
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 29 06:26:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:14.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574761591, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 10246072, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 257, "largest_seqno": 7850, "table_properties": {"data_size": 10212483, "index_size": 22402, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10117, "raw_key_size": 96609, "raw_average_key_size": 24, "raw_value_size": 10133446, "raw_average_value_size": 2520, "num_data_blocks": 982, "num_entries": 4021, "num_filter_entries": 4021, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 1764397161, "file_creation_time": 1764397574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 92525 microseconds, and 26134 cpu microseconds.
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.761699) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 10246072 bytes OK
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.761725) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.767871) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.767886) EVENT_LOG_v1 {"time_micros": 1764397574767882, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.767903) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 16637428, prev total WAL file size 16638063, number of live WAL files 2.
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.771616) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10005KB) 8(1648B)]
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574771839, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 10247720, "oldest_snapshot_seqno": -1}
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3770 keys, 10242579 bytes, temperature: kUnknown
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574864128, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 10242579, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10209657, "index_size": 22380, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9477, "raw_key_size": 92431, "raw_average_key_size": 24, "raw_value_size": 10133726, "raw_average_value_size": 2687, "num_data_blocks": 982, "num_entries": 3770, "num_filter_entries": 3770, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764397574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.864451) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 10242579 bytes
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.867268) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 110.9 rd, 110.9 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(9.8, 0.0 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 4026, records dropped: 256 output_compression: NoCompression
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.867307) EVENT_LOG_v1 {"time_micros": 1764397574867277, "job": 4, "event": "compaction_finished", "compaction_time_micros": 92377, "compaction_time_cpu_micros": 33826, "output_level": 6, "num_output_files": 1, "total_output_size": 10242579, "num_input_records": 4026, "num_output_records": 3770, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574869538, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574869587, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 29 06:26:14 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.771348) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:26:15 compute-1 ceph-mon[80754]: pgmap v413: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:16.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:16.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:17 compute-1 sshd-session[101082]: Accepted publickey for zuul from 192.168.122.30 port 41192 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:26:17 compute-1 systemd-logind[785]: New session 39 of user zuul.
Nov 29 06:26:17 compute-1 systemd[1]: Started Session 39 of User zuul.
Nov 29 06:26:17 compute-1 sshd-session[101082]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:26:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:18.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:18 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:18 compute-1 ceph-mon[80754]: pgmap v414: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 10 B/s, 0 objects/s recovering
Nov 29 06:26:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:18.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:18 compute-1 python3.9[101235]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:26:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:26:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:20.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:26:20 compute-1 sudo[101389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evomwznhjgewuhwpiaefwltqxcgeoiuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397579.9000187-65-60469603378234/AnsiballZ_file.py'
Nov 29 06:26:20 compute-1 sudo[101389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:20 compute-1 python3.9[101391]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:20 compute-1 sudo[101389]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:20.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:20 compute-1 sshd-session[101080]: Connection closed by 119.45.242.7 port 54114 [preauth]
Nov 29 06:26:21 compute-1 ceph-mon[80754]: pgmap v415: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 9 B/s, 0 objects/s recovering
Nov 29 06:26:21 compute-1 ceph-mon[80754]: pgmap v416: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 9 B/s, 0 objects/s recovering
Nov 29 06:26:21 compute-1 sudo[101565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqqjcbqtsqbztwdupelwovaneqoepquz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397580.9807463-89-32341433243183/AnsiballZ_stat.py'
Nov 29 06:26:21 compute-1 sudo[101565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:22 compute-1 python3.9[101567]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:22 compute-1 sudo[101565]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:22.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:22 compute-1 sudo[101643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlmnlvfisrppadklubkbrinjwbubyyag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397580.9807463-89-32341433243183/AnsiballZ_file.py'
Nov 29 06:26:22 compute-1 sudo[101643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:22 compute-1 python3.9[101645]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.o1yfqgms recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:22 compute-1 sudo[101643]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:22 compute-1 ceph-mon[80754]: pgmap v417: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:22.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:23 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:23 compute-1 sudo[101795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcvjvkegktmtcbhhjflrrcwnyfocpncc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397583.2077258-149-133773144923069/AnsiballZ_stat.py'
Nov 29 06:26:23 compute-1 sudo[101795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:23 compute-1 python3.9[101797]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:23 compute-1 sudo[101795]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:23 compute-1 sudo[101873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxoogsrgeonzqtfschlohteehuxwqkun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397583.2077258-149-133773144923069/AnsiballZ_file.py'
Nov 29 06:26:23 compute-1 sudo[101873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:24.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:24 compute-1 ceph-mon[80754]: pgmap v418: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:24 compute-1 python3.9[101875]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.ybw6pfgn recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:24 compute-1 sudo[101873]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:24.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:24 compute-1 sudo[102025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qldqrslevhptxpvvsudvdxcbfxqddtfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397584.509634-188-7554940455261/AnsiballZ_file.py'
Nov 29 06:26:24 compute-1 sudo[102025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:24 compute-1 python3.9[102027]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:26:25 compute-1 sudo[102025]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:25 compute-1 sudo[102177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykqzasycscjulliqdnvmvbrhgqdpegmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397585.5351913-212-85827803665909/AnsiballZ_stat.py'
Nov 29 06:26:25 compute-1 sudo[102177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:26 compute-1 python3.9[102179]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:26 compute-1 sudo[102177]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:26.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:26 compute-1 sudo[102255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvhbpeiyxsuztebwpjznjrgumkgrrqbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397585.5351913-212-85827803665909/AnsiballZ_file.py'
Nov 29 06:26:26 compute-1 sudo[102255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:26 compute-1 python3.9[102257]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:26:26 compute-1 sudo[102255]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:26.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:27 compute-1 sudo[102407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkujwtspcojosyxgfudtkzdvvkhzlpvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397586.7545826-212-174970034543826/AnsiballZ_stat.py'
Nov 29 06:26:27 compute-1 sudo[102407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:27 compute-1 python3.9[102409]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:27 compute-1 sudo[102407]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:27 compute-1 sudo[102485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qugqiawntrlclzwqougogpgsymknokms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397586.7545826-212-174970034543826/AnsiballZ_file.py'
Nov 29 06:26:27 compute-1 sudo[102485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:27 compute-1 python3.9[102487]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:26:27 compute-1 sudo[102485]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:28.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:28 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:28 compute-1 sudo[102637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxrthczpsdhaihhdcxvdnuqzdgqfinvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397588.0709772-281-151201786525722/AnsiballZ_file.py'
Nov 29 06:26:28 compute-1 sudo[102637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:28 compute-1 python3.9[102639]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:28 compute-1 sudo[102637]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:28.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:29 compute-1 sudo[102789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoialssankccybkzrkwnmdwolekjhdoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397588.9317648-305-1638067791368/AnsiballZ_stat.py'
Nov 29 06:26:29 compute-1 sudo[102789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:29 compute-1 python3.9[102791]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:29 compute-1 sudo[102789]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:29 compute-1 sudo[102867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbddfrghkolsnfdnbotrjosjxauqasex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397588.9317648-305-1638067791368/AnsiballZ_file.py'
Nov 29 06:26:29 compute-1 sudo[102867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:30 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:26:30 compute-1 python3.9[102869]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:30 compute-1 sudo[102867]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:30.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:30 compute-1 sudo[103019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avizrhtcntbageqcyidpxdewdxfzdxeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397590.284916-341-133549634465085/AnsiballZ_stat.py'
Nov 29 06:26:30 compute-1 sudo[103019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:30.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:30 compute-1 python3.9[103021]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:30 compute-1 sudo[103019]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:31 compute-1 ceph-mon[80754]: pgmap v419: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:31 compute-1 sudo[103097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzxbocbrpioopxysdomaaxijnlzhydet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397590.284916-341-133549634465085/AnsiballZ_file.py'
Nov 29 06:26:31 compute-1 sudo[103097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:31 compute-1 python3.9[103099]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:31 compute-1 sudo[103097]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:32 compute-1 sudo[103249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-molgdqzytfupbzondfwldmmfeniipjdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397591.4978516-377-130483277722137/AnsiballZ_systemd.py'
Nov 29 06:26:32 compute-1 sudo[103249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:32.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:32 compute-1 python3.9[103251]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:26:32 compute-1 systemd[1]: Reloading.
Nov 29 06:26:32 compute-1 systemd-rc-local-generator[103275]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:26:32 compute-1 systemd-sysv-generator[103282]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:26:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:32.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:32 compute-1 sudo[103249]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:33 compute-1 ceph-mon[80754]: pgmap v420: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:33 compute-1 ceph-mon[80754]: pgmap v421: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:33 compute-1 ceph-mon[80754]: pgmap v422: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:33 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:33 compute-1 sudo[103438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuxwcpnuiuqnlilauxmfkiesjhrdapnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397593.1183496-401-216694956095139/AnsiballZ_stat.py'
Nov 29 06:26:33 compute-1 sudo[103438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:33 compute-1 python3.9[103440]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:33 compute-1 sudo[103438]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:34 compute-1 sudo[103516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmxunqbqcyvawztlmhvnccjkocyczddm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397593.1183496-401-216694956095139/AnsiballZ_file.py'
Nov 29 06:26:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:34 compute-1 sudo[103516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:34.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:34 compute-1 python3.9[103518]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:34 compute-1 sudo[103516]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:34.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:35 compute-1 sudo[103668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yooilndtqdzwuynvjxbulqzcibmjulum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397594.6736398-437-15609471838043/AnsiballZ_stat.py'
Nov 29 06:26:35 compute-1 sudo[103668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:35 compute-1 python3.9[103670]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:35 compute-1 sudo[103668]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:35 compute-1 sudo[103746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwnpcsosbafnnaqatislbjnmjrbiyhvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397594.6736398-437-15609471838043/AnsiballZ_file.py'
Nov 29 06:26:35 compute-1 sudo[103746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:35 compute-1 python3.9[103748]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:35 compute-1 sudo[103746]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:36.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:36 compute-1 sudo[103898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyestkafpmumryqslovchjfizmupxizf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397595.9384375-473-241558616099428/AnsiballZ_systemd.py'
Nov 29 06:26:36 compute-1 sudo[103898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:36 compute-1 python3.9[103900]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:26:36 compute-1 systemd[1]: Reloading.
Nov 29 06:26:36 compute-1 ceph-mon[80754]: pgmap v423: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:36 compute-1 systemd-sysv-generator[103932]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:26:36 compute-1 systemd-rc-local-generator[103927]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:26:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:36.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:36 compute-1 systemd[1]: Starting Create netns directory...
Nov 29 06:26:36 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:26:36 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:26:36 compute-1 systemd[1]: Finished Create netns directory.
Nov 29 06:26:37 compute-1 sudo[103898]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:37 compute-1 ceph-mon[80754]: pgmap v424: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:37 compute-1 ceph-mon[80754]: pgmap v425: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:37 compute-1 python3.9[104093]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:26:37 compute-1 network[104110]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:26:37 compute-1 network[104111]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:26:37 compute-1 network[104112]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:26:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:38.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:38 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:38.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:39 compute-1 ceph-mon[80754]: pgmap v426: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:40.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:40.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:26:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:42.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:26:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:42.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:43 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:44.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:44 compute-1 sudo[104372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptppuvisxjbbdylqzwmzqwtxrgqykfvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397604.1396194-551-133896433065801/AnsiballZ_stat.py'
Nov 29 06:26:44 compute-1 sudo[104372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:44 compute-1 python3.9[104374]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:44 compute-1 sudo[104372]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:44.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:44 compute-1 sudo[104450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsmbruxbyqmnjgtoodxveujjqmtzgxau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397604.1396194-551-133896433065801/AnsiballZ_file.py'
Nov 29 06:26:44 compute-1 sudo[104450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:45 compute-1 python3.9[104452]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:45 compute-1 sudo[104450]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:46 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:26:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:26:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:46.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:26:46 compute-1 sshd-session[104477]: Invalid user service from 66.94.122.234 port 52968
Nov 29 06:26:46 compute-1 sshd-session[104477]: Received disconnect from 66.94.122.234 port 52968:11: Bye Bye [preauth]
Nov 29 06:26:46 compute-1 sshd-session[104477]: Disconnected from invalid user service 66.94.122.234 port 52968 [preauth]
Nov 29 06:26:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:46.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:48.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:48 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:48.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:50 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:26:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:50.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:50 compute-1 sudo[104606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vokxarpgyjbsexdkuozfjebbvwyuklzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397610.178584-590-18128381006058/AnsiballZ_file.py'
Nov 29 06:26:50 compute-1 sudo[104606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:50 compute-1 python3.9[104608]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:50 compute-1 sudo[104606]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:26:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:50.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:26:50 compute-1 sshd-session[104567]: Invalid user actions from 71.70.164.48 port 47688
Nov 29 06:26:50 compute-1 sshd-session[104567]: Received disconnect from 71.70.164.48 port 47688:11: Bye Bye [preauth]
Nov 29 06:26:50 compute-1 sshd-session[104567]: Disconnected from invalid user actions 71.70.164.48 port 47688 [preauth]
Nov 29 06:26:51 compute-1 sudo[104758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubrdgkxkdzjnraiaurbrwbnrimnzjowp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397611.012943-614-42780647073003/AnsiballZ_stat.py'
Nov 29 06:26:51 compute-1 sudo[104758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:51 compute-1 python3.9[104760]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).paxos(paxos updating c 252..773) lease_timeout -- calling new election
Nov 29 06:26:51 compute-1 ceph-mon[80754]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Nov 29 06:26:51 compute-1 ceph-mon[80754]: paxos.2).electionLogic(14) init, last seen epoch 14
Nov 29 06:26:51 compute-1 sudo[104758]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:51 compute-1 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:26:51 compute-1 sudo[104836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcirexmpgdcscuuhgrqggcjbfmiukinm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397611.012943-614-42780647073003/AnsiballZ_file.py'
Nov 29 06:26:51 compute-1 sudo[104836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:52 compute-1 python3.9[104838]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:52 compute-1 sudo[104836]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:52.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:52 compute-1 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:26:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:52.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:53 compute-1 sudo[104988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnthduxwkthhebxxoqytcgeyjaayskmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397612.690614-659-194653880939788/AnsiballZ_timezone.py'
Nov 29 06:26:53 compute-1 sudo[104988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:53 compute-1 ceph-mon[80754]: pgmap v427: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:53 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:26:53 compute-1 python3.9[104990]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 06:26:53 compute-1 systemd[1]: Starting Time & Date Service...
Nov 29 06:26:53 compute-1 systemd[1]: Started Time & Date Service.
Nov 29 06:26:53 compute-1 sudo[104988]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:53 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:54.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:54 compute-1 sudo[105144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbhgwgtonqxpbteivzsuqkwqxirvvcwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397613.9705913-686-158276715816699/AnsiballZ_file.py'
Nov 29 06:26:54 compute-1 sudo[105144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:54 compute-1 python3.9[105146]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:54 compute-1 sudo[105144]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:54 compute-1 sudo[105148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:54 compute-1 sudo[105148]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:54 compute-1 sudo[105148]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:54 compute-1 sudo[105196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:26:54 compute-1 sudo[105196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:54 compute-1 sudo[105196]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:54 compute-1 sudo[105222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:26:54 compute-1 sudo[105222]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:54 compute-1 sudo[105222]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:54.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:54 compute-1 sudo[105277]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:26:54 compute-1 sudo[105277]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:26:54 compute-1 ceph-mon[80754]: pgmap v428: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-1 ceph-mon[80754]: pgmap v429: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-1 ceph-mon[80754]: pgmap v430: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-1 ceph-mon[80754]: pgmap v431: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-1 ceph-mon[80754]: pgmap v432: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-1 ceph-mon[80754]: mon.compute-1 calling monitor election
Nov 29 06:26:54 compute-1 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 06:26:54 compute-1 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 06:26:54 compute-1 ceph-mon[80754]: pgmap v433: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:54 compute-1 ceph-mon[80754]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 06:26:54 compute-1 ceph-mon[80754]: monmap e3: 3 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],compute-1=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],compute-2=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]} removed_ranks: {} disallowed_leaders: {}
Nov 29 06:26:54 compute-1 ceph-mon[80754]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:26:54 compute-1 ceph-mon[80754]: osdmap e139: 3 total, 3 up, 3 in
Nov 29 06:26:54 compute-1 ceph-mon[80754]: mgrmap e10: compute-0.vxabpq(active, since 9m), standbys: compute-2.ngsyhe, compute-1.gaxpay
Nov 29 06:26:54 compute-1 ceph-mon[80754]: overall HEALTH_OK
Nov 29 06:26:55 compute-1 sudo[105415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdpelcwujcmpvuhuoplzzkkqtksbqhkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397614.7537243-710-234940107914027/AnsiballZ_stat.py'
Nov 29 06:26:55 compute-1 sudo[105415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:55 compute-1 python3.9[105423]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:55 compute-1 sudo[105415]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:55 compute-1 sudo[105559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzetbltjvbcszxyrhpgmxmqghpgbqdbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397614.7537243-710-234940107914027/AnsiballZ_file.py'
Nov 29 06:26:55 compute-1 sudo[105559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:55 compute-1 podman[105472]: 2025-11-29 06:26:55.620188535 +0000 UTC m=+0.309277261 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:26:55 compute-1 podman[105472]: 2025-11-29 06:26:55.734051917 +0000 UTC m=+0.423140603 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Nov 29 06:26:55 compute-1 python3.9[105561]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:55 compute-1 sudo[105559]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:56 compute-1 ceph-mon[80754]: pgmap v434: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:56.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:56 compute-1 sudo[105748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqdwkpdwlwoqcwczwceyyefsbshyqfob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397615.985348-746-151487221626430/AnsiballZ_stat.py'
Nov 29 06:26:56 compute-1 sudo[105748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:56 compute-1 python3.9[105752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:56 compute-1 sudo[105748]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:56 compute-1 sudo[105277]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:56 compute-1 sudo[105898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cngdnbbmpyomiemoihwvuyjsroxbavpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397615.985348-746-151487221626430/AnsiballZ_file.py'
Nov 29 06:26:56 compute-1 sudo[105898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:56.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:56 compute-1 python3.9[105900]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.o5ocs_xp recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:56 compute-1 sudo[105898]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:57 compute-1 sudo[106050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epppuxutqoegdtygpgnfqpfjyomxptug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397617.1628804-782-276220178122801/AnsiballZ_stat.py'
Nov 29 06:26:57 compute-1 sudo[106050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:57 compute-1 python3.9[106052]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:57 compute-1 sudo[106050]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:58 compute-1 sudo[106128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvyvgtnyskkonkqjystwhdsxaflrdpzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397617.1628804-782-276220178122801/AnsiballZ_file.py'
Nov 29 06:26:58 compute-1 sudo[106128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:26:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:58.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:26:58 compute-1 python3.9[106130]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:58 compute-1 sudo[106128]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:58 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:26:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:26:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:26:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:58.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:26:59 compute-1 ceph-mon[80754]: pgmap v435: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:26:59 compute-1 sudo[106280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nefxwbryooefpcuygrwokzrkkkdvzpgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397618.6505034-821-162867583342746/AnsiballZ_command.py'
Nov 29 06:26:59 compute-1 sudo[106280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:59 compute-1 python3.9[106282]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:26:59 compute-1 sudo[106280]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:00 compute-1 sudo[106433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmvojjdebyigbyqevfcgvpoeipazhiel ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397619.5817282-845-42067474296780/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:27:00 compute-1 sudo[106433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:00.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:27:00 compute-1 python3[106435]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:27:00 compute-1 sudo[106433]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:00 compute-1 sudo[106512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:27:00 compute-1 sudo[106512]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:00 compute-1 sudo[106512]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:00 compute-1 sudo[106560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:27:00 compute-1 sudo[106560]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:00 compute-1 sudo[106560]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:00.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:00 compute-1 sudo[106608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:27:00 compute-1 sudo[106608]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:00 compute-1 sudo[106608]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:00 compute-1 sudo[106668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlyxpmecuulivfuxzkmtjmuoruithqyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397620.504176-869-229976053069436/AnsiballZ_stat.py'
Nov 29 06:27:00 compute-1 sudo[106668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:00 compute-1 sudo[106655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:27:00 compute-1 sudo[106655]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:01 compute-1 python3.9[106683]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:01 compute-1 sudo[106668]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:01 compute-1 sudo[106789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxwzhovhounyzntyvllyafwpztbzzdnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397620.504176-869-229976053069436/AnsiballZ_file.py'
Nov 29 06:27:01 compute-1 sudo[106789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:01 compute-1 sudo[106655]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:01 compute-1 python3.9[106796]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:01 compute-1 sudo[106789]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:02 compute-1 sudo[106946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmvrmbzpczwjykhllofepjaitixbiarz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397621.7799022-905-128074493211762/AnsiballZ_stat.py'
Nov 29 06:27:02 compute-1 sudo[106946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:02.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:02 compute-1 python3.9[106948]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:02 compute-1 sudo[106946]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:02 compute-1 sudo[107025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nniwpbieiouytyqvfxnpilpuvybxvyng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397621.7799022-905-128074493211762/AnsiballZ_file.py'
Nov 29 06:27:02 compute-1 sudo[107025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:02.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:02 compute-1 python3.9[107027]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:02 compute-1 sudo[107025]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:03 compute-1 ceph-mon[80754]: pgmap v436: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:03 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:03 compute-1 sudo[107177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vskkanfydgghfiwlwhbxwdwsacvttdwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397623.2530336-941-248109435962108/AnsiballZ_stat.py'
Nov 29 06:27:03 compute-1 sudo[107177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:03 compute-1 python3.9[107179]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:03 compute-1 sudo[107177]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:04 compute-1 sudo[107255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftefemibknuuwzzwcfujxmmzpdtmbkee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397623.2530336-941-248109435962108/AnsiballZ_file.py'
Nov 29 06:27:04 compute-1 sudo[107255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:04.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:04 compute-1 python3.9[107257]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:04 compute-1 sudo[107255]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:04.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:04 compute-1 sudo[107407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lelsqwvzqckqmhimhdijzcwkebqcxuqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397624.5775585-978-103030095522018/AnsiballZ_stat.py'
Nov 29 06:27:04 compute-1 sudo[107407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:05 compute-1 python3.9[107409]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:05 compute-1 sudo[107407]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:05 compute-1 sudo[107485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzwvsuueeuqhqswvijkdmefzqpcxctww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397624.5775585-978-103030095522018/AnsiballZ_file.py'
Nov 29 06:27:05 compute-1 sudo[107485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:05 compute-1 python3.9[107487]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:05 compute-1 sudo[107485]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:06.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:06 compute-1 sudo[107637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrcedipdklixvdhzpcpsiajlasmnchze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397626.1696334-1013-276527419245075/AnsiballZ_stat.py'
Nov 29 06:27:06 compute-1 sudo[107637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:06.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:06 compute-1 python3.9[107639]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:06 compute-1 sudo[107637]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:07 compute-1 sudo[107715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjzwnwqztpmpuawmhwwieuuwoolqnojw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397626.1696334-1013-276527419245075/AnsiballZ_file.py'
Nov 29 06:27:07 compute-1 sudo[107715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:07 compute-1 python3.9[107717]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:07 compute-1 sudo[107715]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:08.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:08 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:08.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:09 compute-1 sudo[107867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upmuqjuiwbkmhyxscuqjkkiqztzprttc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397629.2205532-1053-41193576022437/AnsiballZ_command.py'
Nov 29 06:27:09 compute-1 sudo[107867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:09 compute-1 python3.9[107869]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:27:09 compute-1 sudo[107867]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:10.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:10 compute-1 sudo[108022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avwjltmonmnejlcztqanycjkjdcbjluc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397630.0644114-1076-9001731501358/AnsiballZ_blockinfile.py'
Nov 29 06:27:10 compute-1 sudo[108022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:10 compute-1 python3.9[108024]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:10 compute-1 sudo[108022]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:10.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:11 compute-1 sudo[108174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzbgdwobekhoqntgxqbkzmyawybiyyah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397631.110063-1103-81839649927575/AnsiballZ_file.py'
Nov 29 06:27:11 compute-1 sudo[108174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:11 compute-1 python3.9[108176]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:11 compute-1 sudo[108174]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:12 compute-1 sudo[108326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vukcgcolqurssdxfhqppegurrsfqlfuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397631.8578525-1103-8280745289362/AnsiballZ_file.py'
Nov 29 06:27:12 compute-1 sudo[108326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:12.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:12 compute-1 ceph-mon[80754]: pgmap v437: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:12 compute-1 ceph-mon[80754]: pgmap v438: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:12 compute-1 python3.9[108328]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:12 compute-1 sudo[108326]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:12 compute-1 sshd-session[106949]: error: kex_exchange_identification: read: Connection timed out
Nov 29 06:27:12 compute-1 sshd-session[106949]: banner exchange: Connection from 119.45.242.7 port 36644: Connection timed out
Nov 29 06:27:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:12.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:13 compute-1 sudo[108478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqpexrrklgmuctdzfpnnunspzvtxvnzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397632.6359212-1148-253175737868917/AnsiballZ_mount.py'
Nov 29 06:27:13 compute-1 sudo[108478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:13 compute-1 python3.9[108480]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 06:27:13 compute-1 sudo[108478]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:13 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:13 compute-1 sudo[108630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znouscfzfshmwgmewnlxracrvslfnybl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397633.5820851-1148-242716131509142/AnsiballZ_mount.py'
Nov 29 06:27:13 compute-1 sudo[108630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:14 compute-1 python3.9[108632]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 06:27:14 compute-1 sudo[108630]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:14.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:14 compute-1 ceph-mon[80754]: pgmap v439: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:27:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:27:14 compute-1 ceph-mon[80754]: pgmap v440: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:14 compute-1 ceph-mon[80754]: pgmap v441: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:14 compute-1 ceph-mon[80754]: pgmap v442: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:14 compute-1 ceph-mon[80754]: pgmap v443: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:14 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:27:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:14.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:14 compute-1 sshd-session[101085]: Connection closed by 192.168.122.30 port 41192
Nov 29 06:27:14 compute-1 sshd-session[101082]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:27:14 compute-1 systemd[1]: session-39.scope: Deactivated successfully.
Nov 29 06:27:14 compute-1 systemd[1]: session-39.scope: Consumed 32.826s CPU time.
Nov 29 06:27:14 compute-1 systemd-logind[785]: Session 39 logged out. Waiting for processes to exit.
Nov 29 06:27:14 compute-1 systemd-logind[785]: Removed session 39.
Nov 29 06:27:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:16.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:16 compute-1 sshd-session[108658]: Received disconnect from 45.55.249.98 port 47388:11: Bye Bye [preauth]
Nov 29 06:27:16 compute-1 sshd-session[108658]: Disconnected from authenticating user root 45.55.249.98 port 47388 [preauth]
Nov 29 06:27:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:16.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:18.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:18 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:18.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:27:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:27:19 compute-1 ceph-mon[80754]: pgmap v444: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:19 compute-1 ceph-mon[80754]: pgmap v445: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:19 compute-1 ceph-mon[80754]: pgmap v446: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:20.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:20.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:27:20 compute-1 sshd-session[108662]: Accepted publickey for zuul from 192.168.122.30 port 57306 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:27:21 compute-1 systemd-logind[785]: New session 40 of user zuul.
Nov 29 06:27:21 compute-1 systemd[1]: Started Session 40 of User zuul.
Nov 29 06:27:21 compute-1 sshd-session[108662]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:27:21 compute-1 sudo[108815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbvxzlmtqdsyvyxajlhkhxscdhfewmhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397641.135401-24-134283031368724/AnsiballZ_tempfile.py'
Nov 29 06:27:21 compute-1 sudo[108815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:21 compute-1 ceph-mon[80754]: pgmap v447: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:21 compute-1 python3.9[108817]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 29 06:27:21 compute-1 sudo[108815]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:21 compute-1 sshd-session[108660]: Received disconnect from 118.194.230.250 port 50782:11: Bye Bye [preauth]
Nov 29 06:27:21 compute-1 sshd-session[108660]: Disconnected from authenticating user root 118.194.230.250 port 50782 [preauth]
Nov 29 06:27:22 compute-1 sudo[108843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:27:22 compute-1 sudo[108843]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:22 compute-1 sudo[108843]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:22.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:22 compute-1 sudo[108891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:27:22 compute-1 sudo[108891]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:27:22 compute-1 sudo[108891]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:27:22 compute-1 sudo[109017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyubynwbojogfppyqlxenkhwwcggmbhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397642.1606414-60-186218579930862/AnsiballZ_stat.py'
Nov 29 06:27:22 compute-1 sudo[109017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:22.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:22 compute-1 python3.9[109019]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:27:22 compute-1 sudo[109017]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:23 compute-1 sudo[109171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sepzbyumozxjrrjbhxuvhbvwdwvkgygw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397643.0710588-84-97490835034433/AnsiballZ_slurp.py'
Nov 29 06:27:23 compute-1 sudo[109171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:23 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 06:27:23 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:23 compute-1 python3.9[109173]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Nov 29 06:27:23 compute-1 sudo[109171]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:24.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:24 compute-1 sudo[109325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzupcdvamesemqnngcfkqlsbjpvbxnqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397643.9492702-108-120182948309350/AnsiballZ_stat.py'
Nov 29 06:27:24 compute-1 sudo[109325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:24 compute-1 ceph-mon[80754]: pgmap v448: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:24 compute-1 python3.9[109327]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.t0fyas5g follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:24 compute-1 sudo[109325]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:24.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:24 compute-1 sudo[109450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymvsokckbutqjobqdxvijcoyqsgyufnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397643.9492702-108-120182948309350/AnsiballZ_copy.py'
Nov 29 06:27:24 compute-1 sudo[109450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:25 compute-1 python3.9[109452]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.t0fyas5g mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397643.9492702-108-120182948309350/.source.t0fyas5g _original_basename=.q5o1tos6 follow=False checksum=b291f010aefff8b88f41011b780271a83fd1182f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:25 compute-1 sudo[109450]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:26.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:26 compute-1 sudo[109602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juktbozawrlmknngiraukavqkrdlqwza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397645.4513166-153-15396650810352/AnsiballZ_setup.py'
Nov 29 06:27:26 compute-1 sudo[109602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:26 compute-1 python3.9[109604]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:27:26 compute-1 sudo[109602]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:26.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:27 compute-1 sudo[109754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-komosqvydslhafuakfqdvxsvsztmfity ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397646.8965364-178-133758238889869/AnsiballZ_blockinfile.py'
Nov 29 06:27:27 compute-1 sudo[109754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:27 compute-1 python3.9[109756]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2GXKCQiCwQEMihcSwDVeJtG2CpTemmA6MTbtOkxbB3OAV5PK8v8imPvDGMDurfGFQG0RzWyv9szlMJXdgIkwejIfy/AY7p6nemHOpu6DdAx0EA/jg1YcOIeeEhyMw1/oFzjYClGMohaI1oTKHtR29UXWphTAroOkf26Exvco6hh2ApRTXV9ObzSoOyCC7+OZcOWgYzdoCfu/0FDGkH2ksKLQS7d4AAh/XZ/njXhK57U7ptxHCReUPECGRv7KB4f8TelZDAIeUyp7ngd/9ivUDO1zue1Qr9ECzTzAFqippGXFmYl3+oSid03CY7bqnxav4xWt7UukbaO57goyIPfkklPdC1kA7kZqa9bqeDU1WgDkqnLu8hluArB0Y0Jz+hDfx9pTbAL6MklraoLaGrnrgcibAollAN+7WGqdWxUotENYaljO7P1Z18MlNllWFzk4Le5jMLNL8qArSlzM+ufOThnLdGEuYZhH1x969AisGQ4MQWn0P0lZFu6fE5VSNA/k=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDdPWx5WoFJTxz6PiFZL5f3XrtE682RjGFiIpoe0LXZO
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQlZMweHfLYiJFtm1r2tQze/oNx6KzgaXkK+Kof7POk0cFMLbTsXU8qgbQMh4o5LVO0Hbas4mAqxRkGcFCg2Po=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCX0dhB1m0xL0qEi5jnTQLLB4bvueVV5foNrqU/OkfV/4gRyp7uP2q21lWq5Dtl2GLk51pS6oD41RI41Y5g7OSRs8b1Z66d6X1QgX0Qns6pv7FwmNSQ25+2VGV6lppnaN5e+JHiwTmzpf82hl/MiiJrHo7B63mllKyl9SZJxUhP9RR4czS3QNYQsZyP7sZeCWothTZ2Q/GK4BWBEtj2+ifeOpa342IivopCH05YVQOx9bpsdFHMYaalMDCwvr2lfVns8aTcpJ3z9uE8wLdKWTyiinT7nuLX6RuPwhXB2proBRH1wrGSIUgcVcizkWn8QizD8LlsGFcHIQJkmq+sJz6r7cCZLIfS6hdAzI+hYbJie6n/agwfxe4r+mbXsmmC6ALKKk7CEnaiNnDg0fgTaUfBPwSfu+JmVrjdSO+S8f/CMbtYeO6QknOxhLV9oK6knszv7nLlSYXTzXanHkN4Y0fW3dsSvoE+qDR0YijbbT8slqMd6z95wWVDFUmTcN8Nzk8=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILci1PI4hoB56+xxS5gSMKceuJ/dv6t7etpmtENwoSFr
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJIaOLr2ntjSUcigXC7a0sFoonsuh0ChCx2a1R6G8EDmJ8/ZB8NEiJE6KAQJDNU5XsXjuaC44eJhOUMRK9r98xA=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUVpPatup3d17omeiTdJaYR8jCcDbraJSPBxWy49Wxst4G+6/lD41HVIKmjgCgIbbmYSFBPQmoXt4gFXP4FRKna6AbQWi0kwF3/T2biQ2qCid0HVDSS8YRVlyrpdVc1/bIg6YNLkGnhzOMp0S1443+cg5PqutAbrAT1LOg6lSBu+K9gIqJ4un3l2guSweoyba5UhMyjrq4Pffx1QCuBggtYSjmA9Q1r5VVNc2J7AbP0QuzOe6J6DhpdGJsfmHDVXZb/4b/aPUdCTKkLseyUtcqElWVhhnGnpYSJdN81ejalSktGHE4JRHih19wwTokiKvoczUgijBzOfl+kt2ELcpDgzpzY0M9yd0Zz7wrK4rLM6hi8x3LYZXZv8N7KnawUcJ2jfzilx1BVLdNzgwDNB7ZlP4O9Vs3fKnBufCUFPNcRyWl6ooczepbgxqgSbr/Ham2O4/qzvJmzLtu0KxBkaFALRWnyM39nYVE/jrMKJ5ihtVDxIY9FGma/Jifg15gqI0=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN19pK3a7AH/OiwlqJTVWP/qzU/QzkC16s4D1xY1Vn6J
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLsXsjJNPVMX1YVTe2oBmcZpUSiv3HOeuICgZtQun4hTopMXH9dE1jQeUruGwqZ+NsKW6X2bLZZJ0/tcn2owL8Q=
                                              create=True mode=0644 path=/tmp/ansible.t0fyas5g state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:27 compute-1 sudo[109754]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:28 compute-1 sudo[109906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmuhmgspqodxrqemllezpblcenxnwljp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397647.751911-202-48115128004359/AnsiballZ_command.py'
Nov 29 06:27:28 compute-1 sudo[109906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:28.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:28 compute-1 python3.9[109908]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.t0fyas5g' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:27:28 compute-1 sudo[109906]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:28 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:28 compute-1 ceph-mon[80754]: pgmap v449: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:28.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:29 compute-1 sudo[110060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeajkozbztzpjnlhcqthqfniwgrrpido ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397648.639601-226-61275429170468/AnsiballZ_file.py'
Nov 29 06:27:29 compute-1 sudo[110060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:29 compute-1 python3.9[110062]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.t0fyas5g state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:29 compute-1 sudo[110060]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:30 compute-1 sshd-session[108665]: Connection closed by 192.168.122.30 port 57306
Nov 29 06:27:30 compute-1 sshd-session[108662]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:27:30 compute-1 systemd-logind[785]: Session 40 logged out. Waiting for processes to exit.
Nov 29 06:27:30 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Nov 29 06:27:30 compute-1 systemd[1]: session-40.scope: Consumed 5.820s CPU time.
Nov 29 06:27:30 compute-1 systemd-logind[785]: Removed session 40.
Nov 29 06:27:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:30.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:30 compute-1 ceph-mon[80754]: pgmap v450: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:30 compute-1 ceph-mon[80754]: pgmap v451: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:30.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:31 compute-1 ceph-mon[80754]: pgmap v452: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:32.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:32.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:33 compute-1 ceph-mon[80754]: pgmap v453: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:33 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:34.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:34.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:34 compute-1 sshd-session[110087]: Accepted publickey for zuul from 192.168.122.30 port 33156 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:27:34 compute-1 systemd-logind[785]: New session 41 of user zuul.
Nov 29 06:27:35 compute-1 systemd[1]: Started Session 41 of User zuul.
Nov 29 06:27:35 compute-1 sshd-session[110087]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:27:35 compute-1 ceph-mon[80754]: pgmap v454: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:36 compute-1 python3.9[110240]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:27:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:36.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:36.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:37 compute-1 sudo[110394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwkwudlcxiihwyskbaxcdnaxkoygenaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397656.4730294-62-81100966230112/AnsiballZ_systemd.py'
Nov 29 06:27:37 compute-1 sudo[110394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:37 compute-1 python3.9[110396]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 06:27:37 compute-1 sudo[110394]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:38 compute-1 sudo[110548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojqlqgbxpdarlfdbiyfsqciobvppvbzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397657.8118963-86-212771426042448/AnsiballZ_systemd.py'
Nov 29 06:27:38 compute-1 sudo[110548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:38 compute-1 ceph-mon[80754]: pgmap v455: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:38.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:38 compute-1 python3.9[110550]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:27:38 compute-1 sudo[110548]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:38 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:38.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:39 compute-1 sudo[110701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwywjwkxpolfdqifehuqhhxtyxewcvtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397658.7481794-113-268103788327313/AnsiballZ_command.py'
Nov 29 06:27:39 compute-1 sudo[110701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:39 compute-1 python3.9[110703]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:27:39 compute-1 sudo[110701]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:40.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:40.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:27:41 compute-1 ceph-mon[80754]: pgmap v456: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:41 compute-1 sudo[110854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmioycctysqfgtrqresrhkuzpqwyrnsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397660.7277365-137-138950153809715/AnsiballZ_stat.py'
Nov 29 06:27:41 compute-1 sudo[110854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:41 compute-1 python3.9[110856]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:27:41 compute-1 sudo[110854]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:42.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:42 compute-1 sudo[111006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdzmrdvirggswlpktmxijugwkcyuatgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397661.758056-164-178815523536330/AnsiballZ_file.py'
Nov 29 06:27:42 compute-1 sudo[111006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:42 compute-1 python3.9[111008]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:42 compute-1 sudo[111006]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:42.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:42 compute-1 sshd-session[110090]: Connection closed by 192.168.122.30 port 33156
Nov 29 06:27:42 compute-1 sshd-session[110087]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:27:42 compute-1 systemd-logind[785]: Session 41 logged out. Waiting for processes to exit.
Nov 29 06:27:42 compute-1 systemd[1]: session-41.scope: Deactivated successfully.
Nov 29 06:27:42 compute-1 systemd[1]: session-41.scope: Consumed 4.317s CPU time.
Nov 29 06:27:42 compute-1 systemd-logind[785]: Removed session 41.
Nov 29 06:27:43 compute-1 ceph-mon[80754]: pgmap v457: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:43 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:44.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:44.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:46.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:46.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:47 compute-1 ceph-mon[80754]: pgmap v458: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:48 compute-1 sshd-session[111033]: Accepted publickey for zuul from 192.168.122.30 port 53608 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:27:48 compute-1 systemd-logind[785]: New session 42 of user zuul.
Nov 29 06:27:48 compute-1 systemd[1]: Started Session 42 of User zuul.
Nov 29 06:27:48 compute-1 sshd-session[111033]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:27:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:48.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:48.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:49 compute-1 python3.9[111186]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:27:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:50 compute-1 sudo[111340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-layvbhhdredkmeoypmhrlapzlnirnawz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397669.7299001-68-29957180019501/AnsiballZ_setup.py'
Nov 29 06:27:50 compute-1 sudo[111340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:50.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:50 compute-1 python3.9[111342]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:27:50 compute-1 sudo[111340]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:50.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:51 compute-1 sudo[111424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mluhmhuujzlxoacysndgvbgbslihgthx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397669.7299001-68-29957180019501/AnsiballZ_dnf.py'
Nov 29 06:27:51 compute-1 sudo[111424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:51 compute-1 python3.9[111426]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:27:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:52.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:52.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:53 compute-1 sudo[111424]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:53 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:27:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:54.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:27:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:27:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:54.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:27:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:27:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:56.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:27:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:56.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:57 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:27:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:58.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:27:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:27:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:58.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:27:59 compute-1 ceph-mon[80754]: pgmap v459: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:27:59 compute-1 ceph-mon[80754]: pgmap v460: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:00.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:28:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:00.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:28:01 compute-1 sshd-session[111428]: error: kex_exchange_identification: read: Connection timed out
Nov 29 06:28:01 compute-1 sshd-session[111428]: banner exchange: Connection from 119.45.242.7 port 47404: Connection timed out
Nov 29 06:28:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:28:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:02.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:28:02 compute-1 python3.9[111578]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:28:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:28:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:02.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:28:03 compute-1 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:28:03 compute-1 python3.9[111729]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:28:03 compute-1 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:28:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:04.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:28:04 compute-1 python3.9[111879]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:28:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:04.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:05 compute-1 python3.9[112029]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:28:06 compute-1 sshd-session[111036]: Connection closed by 192.168.122.30 port 53608
Nov 29 06:28:06 compute-1 sshd-session[111033]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:28:06 compute-1 systemd-logind[785]: Session 42 logged out. Waiting for processes to exit.
Nov 29 06:28:06 compute-1 systemd[1]: session-42.scope: Deactivated successfully.
Nov 29 06:28:06 compute-1 systemd[1]: session-42.scope: Consumed 6.357s CPU time.
Nov 29 06:28:06 compute-1 systemd-logind[785]: Removed session 42.
Nov 29 06:28:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:06.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:06.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:08.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:08 compute-1 ceph-mon[80754]: pgmap v467: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:08 compute-1 ceph-mon[80754]: pgmap v468: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:08 compute-1 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 06:28:08 compute-1 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 06:28:08 compute-1 ceph-mon[80754]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 06:28:08 compute-1 ceph-mon[80754]: monmap e3: 3 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],compute-1=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],compute-2=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]} removed_ranks: {} disallowed_leaders: {}
Nov 29 06:28:08 compute-1 ceph-mon[80754]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:28:08 compute-1 ceph-mon[80754]: osdmap e139: 3 total, 3 up, 3 in
Nov 29 06:28:08 compute-1 ceph-mon[80754]: mgrmap e10: compute-0.vxabpq(active, since 11m), standbys: compute-2.ngsyhe, compute-1.gaxpay
Nov 29 06:28:08 compute-1 ceph-mon[80754]: overall HEALTH_OK
Nov 29 06:28:08 compute-1 ceph-mon[80754]: pgmap v469: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:08.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:09 compute-1 ceph-mon[80754]: pgmap v470: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:09 compute-1 ceph-mon[80754]: pgmap v471: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:10.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:10.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:11 compute-1 sshd-session[112054]: Accepted publickey for zuul from 192.168.122.30 port 48338 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:28:11 compute-1 systemd-logind[785]: New session 43 of user zuul.
Nov 29 06:28:11 compute-1 systemd[1]: Started Session 43 of User zuul.
Nov 29 06:28:11 compute-1 sshd-session[112054]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:28:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:12.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:12 compute-1 python3.9[112207]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:28:12 compute-1 ceph-mon[80754]: pgmap v472: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:12.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:14.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:14 compute-1 sudo[112361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-layzriexlefhlhmgcgfbfzxydlcnehye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397693.944413-118-86959341208401/AnsiballZ_file.py'
Nov 29 06:28:14 compute-1 sudo[112361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:14 compute-1 python3.9[112363]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:14 compute-1 sudo[112361]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:14 compute-1 ceph-mon[80754]: pgmap v473: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:14.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:15 compute-1 sudo[112513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aspbisfyqdevomslscmtrhyjifttibgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397694.7296276-118-240508501733058/AnsiballZ_file.py'
Nov 29 06:28:15 compute-1 sudo[112513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:15 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:15 compute-1 python3.9[112515]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:15 compute-1 sudo[112513]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:15 compute-1 sudo[112665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkriwqwndfysayhqkqowsqxblbftytpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397695.4851682-163-217729628174384/AnsiballZ_stat.py'
Nov 29 06:28:15 compute-1 sudo[112665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:16 compute-1 python3.9[112667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:16 compute-1 sudo[112665]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:16.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:16 compute-1 ceph-mon[80754]: pgmap v474: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:16 compute-1 sudo[112788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spmkuqzplptsvzxpqtovbwvmrnschqbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397695.4851682-163-217729628174384/AnsiballZ_copy.py'
Nov 29 06:28:16 compute-1 sudo[112788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:16 compute-1 python3.9[112790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397695.4851682-163-217729628174384/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=c753fd50f03190549921f4ec9ebe197ccf1ffe37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:16 compute-1 sudo[112788]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:16.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:17 compute-1 sudo[112940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cubkpoqjfjivhbdwsdaghuoykmtwogag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397697.0490756-163-7455501384218/AnsiballZ_stat.py'
Nov 29 06:28:17 compute-1 sudo[112940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:17 compute-1 python3.9[112942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:17 compute-1 sudo[112940]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:18 compute-1 sudo[113063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxkfdtfuxuqoifgigxxzthiakrttommh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397697.0490756-163-7455501384218/AnsiballZ_copy.py'
Nov 29 06:28:18 compute-1 sudo[113063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:18.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:18 compute-1 python3.9[113065]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397697.0490756-163-7455501384218/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=03c2952c2692ca442730881904078ac3e566f340 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:18 compute-1 sudo[113063]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:18 compute-1 sudo[113215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjkekuedfwvhmvqqljfwjyrmlvxqylsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397698.4946883-163-117985615079690/AnsiballZ_stat.py'
Nov 29 06:28:18 compute-1 sudo[113215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:18.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:18 compute-1 python3.9[113217]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:19 compute-1 sudo[113215]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:19 compute-1 sudo[113338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swvmsricjdlhjfuyjptczbimrmvyujom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397698.4946883-163-117985615079690/AnsiballZ_copy.py'
Nov 29 06:28:19 compute-1 sudo[113338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:19 compute-1 python3.9[113340]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397698.4946883-163-117985615079690/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=27db9be1e23c3016377de86e7cf7031ed01bcf2d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:19 compute-1 sudo[113338]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:20 compute-1 sudo[113490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvocgptlorcunzcetihqfniooglmanrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397699.9333296-297-122727342703512/AnsiballZ_file.py'
Nov 29 06:28:20 compute-1 sudo[113490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:20 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:20.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:20 compute-1 python3.9[113492]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:20 compute-1 sudo[113490]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:28:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:20.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:28:21 compute-1 sudo[113642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cohhxpondnezcogjobcppxgkygofwicb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397700.604445-297-232351022508631/AnsiballZ_file.py'
Nov 29 06:28:21 compute-1 sudo[113642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:21 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:28:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:22.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:22 compute-1 sudo[113646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:22 compute-1 sudo[113646]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:22 compute-1 sudo[113646]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:22 compute-1 sudo[113671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:28:22 compute-1 sudo[113671]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:22 compute-1 sudo[113671]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:22 compute-1 python3.9[113644]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:22 compute-1 sudo[113642]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:22 compute-1 sudo[113696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:22 compute-1 sudo[113696]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:22 compute-1 sudo[113696]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:22 compute-1 sudo[113740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Nov 29 06:28:22 compute-1 sudo[113740]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:22.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:22 compute-1 sudo[113740]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:23 compute-1 sudo[113915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuinzsjxikkeyocfuwudpchzpqotidwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397702.792364-374-62534259531908/AnsiballZ_stat.py'
Nov 29 06:28:23 compute-1 sudo[113915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:23 compute-1 python3.9[113917]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:23 compute-1 sudo[113915]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:23 compute-1 ceph-mon[80754]: pgmap v475: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:23 compute-1 sudo[114038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmeljcxzmxfvhdwxuwsjqbgtoiqrddkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397702.792364-374-62534259531908/AnsiballZ_copy.py'
Nov 29 06:28:23 compute-1 sudo[114038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:23 compute-1 python3.9[114040]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397702.792364-374-62534259531908/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=af9225c5d9213edb8553d0100161ec5ac71c6435 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:23 compute-1 sudo[114038]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:24 compute-1 sudo[114192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqoghwygpemoatezlkdtcdpqpyrzngol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397704.0555966-374-199806165432541/AnsiballZ_stat.py'
Nov 29 06:28:24 compute-1 sudo[114192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:24.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:24 compute-1 sudo[114190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:24 compute-1 sudo[114190]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:24 compute-1 sudo[114190]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:24 compute-1 sudo[114218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:28:24 compute-1 sudo[114218]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:24 compute-1 sudo[114218]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:24 compute-1 sudo[114243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:24 compute-1 sudo[114243]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:24 compute-1 sudo[114243]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:24 compute-1 python3.9[114205]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:24 compute-1 sudo[114192]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:24 compute-1 sudo[114268]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:28:24 compute-1 sudo[114268]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:24.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:25 compute-1 sudo[114268]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:25 compute-1 ceph-mon[80754]: pgmap v476: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:25 compute-1 ceph-mon[80754]: pgmap v477: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:25 compute-1 ceph-mon[80754]: pgmap v478: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:25 compute-1 sudo[114445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejrikeqvsvidvapbfkwucanvjthsxiif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397704.0555966-374-199806165432541/AnsiballZ_copy.py'
Nov 29 06:28:25 compute-1 sudo[114445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:25 compute-1 python3.9[114447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397704.0555966-374-199806165432541/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=446989bd92736b57ebc923ce429d8effafd00e68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:25 compute-1 sudo[114445]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:25 compute-1 sudo[114597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faraysgdtwhtrtqjyynrxedrmohvwssj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397705.5784872-374-236808187045402/AnsiballZ_stat.py'
Nov 29 06:28:25 compute-1 sudo[114597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:26 compute-1 python3.9[114599]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:26 compute-1 sudo[114597]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:26 compute-1 ceph-mon[80754]: pgmap v479: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:28:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:28:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:28:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:28:26 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:28:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:26.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:26 compute-1 sudo[114720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnttrxtbknehviypjhcpuwlgfuxmxrrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397705.5784872-374-236808187045402/AnsiballZ_copy.py'
Nov 29 06:28:26 compute-1 sudo[114720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:26 compute-1 python3.9[114722]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397705.5784872-374-236808187045402/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=b27264d68008dc068de4ee4a6430b05babb8b7a6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:26 compute-1 sudo[114720]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:26.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:27 compute-1 sudo[114874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzzjytcensniocytyaoyahoeemegmvoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397706.921854-509-144346248229659/AnsiballZ_file.py'
Nov 29 06:28:27 compute-1 sudo[114874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:27 compute-1 python3.9[114876]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:27 compute-1 sudo[114874]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:27 compute-1 sudo[115026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jydtgkgeymeazlvkjkxampiqolmwpqat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397707.6297677-509-156488481976765/AnsiballZ_file.py'
Nov 29 06:28:27 compute-1 sudo[115026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:28 compute-1 python3.9[115028]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:28 compute-1 sudo[115026]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:28.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:28 compute-1 sudo[115178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jopvtfrensfplmncaetjczfvsbtwigae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397708.3308923-554-103063587327110/AnsiballZ_stat.py'
Nov 29 06:28:28 compute-1 sudo[115178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:28.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:29 compute-1 python3.9[115180]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:29 compute-1 sudo[115178]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:29 compute-1 ceph-mon[80754]: pgmap v480: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:29 compute-1 sudo[115301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwkvcyybvcgpilypbeyhpfnmlpyqekxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397708.3308923-554-103063587327110/AnsiballZ_copy.py'
Nov 29 06:28:29 compute-1 sudo[115301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:29 compute-1 python3.9[115303]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397708.3308923-554-103063587327110/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=211e9c84831fe02b2c1e90a47350bc311a668a8e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:29 compute-1 sudo[115301]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:30 compute-1 sudo[115453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wauyrnqxjixkrxomjafepnyskhjzaota ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397709.940725-554-184543457979669/AnsiballZ_stat.py'
Nov 29 06:28:30 compute-1 sudo[115453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:30.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:30 compute-1 python3.9[115455]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:30 compute-1 sudo[115453]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:30 compute-1 sudo[115576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnocguvrrajrecvosusbtovtnodzzlmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397709.940725-554-184543457979669/AnsiballZ_copy.py'
Nov 29 06:28:30 compute-1 sudo[115576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:30.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:31 compute-1 python3.9[115578]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397709.940725-554-184543457979669/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=446989bd92736b57ebc923ce429d8effafd00e68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:31 compute-1 sudo[115576]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:31 compute-1 sudo[115728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwhisdolqlsgcmlnoyqpqncrisyiuves ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397711.2354426-554-83206310104670/AnsiballZ_stat.py'
Nov 29 06:28:31 compute-1 sudo[115728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:31 compute-1 python3.9[115730]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:31 compute-1 sudo[115728]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:32 compute-1 ceph-mon[80754]: pgmap v481: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:32.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:32 compute-1 sudo[115851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nymtfwsvlkmuxtepxtorfjvokpynqlqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397711.2354426-554-83206310104670/AnsiballZ_copy.py'
Nov 29 06:28:32 compute-1 sudo[115851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:32 compute-1 python3.9[115853]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397711.2354426-554-83206310104670/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=5d7eb31663823a154f0a44a495e71d206222dec7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:32 compute-1 sudo[115851]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:32.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:33 compute-1 sshd-session[114813]: Connection closed by 66.94.122.234 port 53274 [preauth]
Nov 29 06:28:33 compute-1 sudo[116003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgvhsgysawqjzwiwjzjpetqaqlqzbucg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397713.58093-746-179361860832441/AnsiballZ_file.py'
Nov 29 06:28:33 compute-1 sudo[116003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:34 compute-1 python3.9[116005]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:34 compute-1 sudo[116003]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:34 compute-1 ceph-mon[80754]: pgmap v482: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:34.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:34 compute-1 sudo[116155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwyqlkqyjxpvoooqrpgvgjlgjkfukvvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397714.381309-771-91806314075671/AnsiballZ_stat.py'
Nov 29 06:28:34 compute-1 sudo[116155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:34 compute-1 python3.9[116157]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:34 compute-1 sudo[116155]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:28:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:34.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:28:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:35 compute-1 sudo[116278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzbnjhjorgxccyamifutvalqhkawyalg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397714.381309-771-91806314075671/AnsiballZ_copy.py'
Nov 29 06:28:35 compute-1 sudo[116278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:35 compute-1 python3.9[116280]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397714.381309-771-91806314075671/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:35 compute-1 sudo[116278]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:28:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 5840 writes, 25K keys, 5840 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 5840 writes, 940 syncs, 6.21 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 5840 writes, 25K keys, 5840 commit groups, 1.0 writes per commit group, ingest: 19.19 MB, 0.03 MB/s
                                           Interval WAL: 5840 writes, 940 syncs, 6.21 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Nov 29 06:28:36 compute-1 sudo[116430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zypwypcmivdqvmlkqxfbneblxcrzbdcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397715.8479493-822-110439396068322/AnsiballZ_file.py'
Nov 29 06:28:36 compute-1 sudo[116430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:36 compute-1 ceph-mon[80754]: pgmap v483: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:36 compute-1 ceph-mon[80754]: pgmap v484: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:36.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:36 compute-1 python3.9[116432]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:36 compute-1 sudo[116430]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:36 compute-1 sudo[116582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdwsrpdfbosnztbhndiyqpgswpurieys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397716.6355143-849-130975966596119/AnsiballZ_stat.py'
Nov 29 06:28:36 compute-1 sudo[116582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:36.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:37 compute-1 python3.9[116584]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:37 compute-1 sudo[116582]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:37 compute-1 sudo[116707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdtmyueclhukhafqjuyxqwekpcobibrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397716.6355143-849-130975966596119/AnsiballZ_copy.py'
Nov 29 06:28:37 compute-1 sudo[116707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:37 compute-1 python3.9[116709]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397716.6355143-849-130975966596119/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:37 compute-1 sudo[116707]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:38 compute-1 sudo[116859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqkbytwklhjtvdcamjqhttndkzmadcci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397718.0267966-901-196655512714098/AnsiballZ_file.py'
Nov 29 06:28:38 compute-1 sudo[116859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:38.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:38 compute-1 python3.9[116861]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:38 compute-1 sshd-session[116680]: Received disconnect from 118.194.230.250 port 50892:11: Bye Bye [preauth]
Nov 29 06:28:38 compute-1 sshd-session[116680]: Disconnected from authenticating user root 118.194.230.250 port 50892 [preauth]
Nov 29 06:28:38 compute-1 sudo[116859]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:38.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:39 compute-1 sudo[117011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omaiodkptzkyhpnyxrscsysrhhsvjvwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397718.8209403-922-174509620318474/AnsiballZ_stat.py'
Nov 29 06:28:39 compute-1 sudo[117011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:39 compute-1 python3.9[117013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:39 compute-1 sudo[117011]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:39 compute-1 sudo[117134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugwmyxcorfippwstppdyxpjqjhspfjxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397718.8209403-922-174509620318474/AnsiballZ_copy.py'
Nov 29 06:28:39 compute-1 sudo[117134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:40 compute-1 ceph-mon[80754]: pgmap v485: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:40 compute-1 python3.9[117136]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397718.8209403-922-174509620318474/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:40 compute-1 sudo[117134]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:40.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:40 compute-1 sudo[117286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjiutnmaleogwrwizuylafeikrukvvwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397720.274358-970-89479640116974/AnsiballZ_file.py'
Nov 29 06:28:40 compute-1 sudo[117286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:40 compute-1 python3.9[117288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:40 compute-1 ceph-mon[80754]: pgmap v486: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:40 compute-1 sudo[117286]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:40.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:41 compute-1 sudo[117438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpjxamtdaqawwpygqwfpivcaewnlzkuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397721.024441-994-78353063345621/AnsiballZ_stat.py'
Nov 29 06:28:41 compute-1 sudo[117438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:41 compute-1 python3.9[117440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:41 compute-1 sudo[117438]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:41 compute-1 ceph-mon[80754]: pgmap v487: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:41 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:41 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:28:41 compute-1 sudo[117488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:28:41 compute-1 sudo[117488]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:41 compute-1 sudo[117488]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:41 compute-1 sudo[117535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:28:41 compute-1 sudo[117535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:28:41 compute-1 sudo[117535]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:42 compute-1 sudo[117611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pltjvazezwfqwgloluinvpgzainipugb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397721.024441-994-78353063345621/AnsiballZ_copy.py'
Nov 29 06:28:42 compute-1 sudo[117611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:42 compute-1 python3.9[117613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397721.024441-994-78353063345621/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:42 compute-1 sudo[117611]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:42.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:42 compute-1 sudo[117763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jauhjzmbjyafdxfdcfyytlyoqvigmvvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397722.5115252-1043-241797427594480/AnsiballZ_file.py'
Nov 29 06:28:42 compute-1 sudo[117763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:42.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:43 compute-1 python3.9[117765]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:43 compute-1 sudo[117763]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:43 compute-1 ceph-mon[80754]: pgmap v488: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:43 compute-1 sudo[117918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfnqtsshiqdgmnpkjxoiogbbpcnpcyjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397723.2541199-1059-236971876014537/AnsiballZ_stat.py'
Nov 29 06:28:43 compute-1 sudo[117918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:43 compute-1 python3.9[117920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:43 compute-1 sudo[117918]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:44 compute-1 sudo[118041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yejiygfrfwtoslungmkhdsegfzzpjsfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397723.2541199-1059-236971876014537/AnsiballZ_copy.py'
Nov 29 06:28:44 compute-1 sudo[118041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:44 compute-1 sshd-session[117766]: Invalid user sol from 80.94.92.182 port 46208
Nov 29 06:28:44 compute-1 python3.9[118043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397723.2541199-1059-236971876014537/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:44 compute-1 sudo[118041]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:44.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:44 compute-1 sshd-session[117766]: Connection closed by invalid user sol 80.94.92.182 port 46208 [preauth]
Nov 29 06:28:44 compute-1 sudo[118193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbqcaickmlnnknxgftbibdzuruxrgrbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397724.5690622-1092-239260348533279/AnsiballZ_file.py'
Nov 29 06:28:44 compute-1 sudo[118193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:44.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:45 compute-1 python3.9[118195]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:45 compute-1 sudo[118193]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:45 compute-1 sudo[118345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrfgicqpfxpyegtxhdqbwltrkojkdkrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397725.383341-1107-124750937152171/AnsiballZ_stat.py'
Nov 29 06:28:45 compute-1 sudo[118345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:45 compute-1 python3.9[118347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:45 compute-1 sudo[118345]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:46 compute-1 ceph-mon[80754]: pgmap v489: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:46.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:46 compute-1 sudo[118468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcpcejvsgrahnxcgoutlkojcstsjrhlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397725.383341-1107-124750937152171/AnsiballZ_copy.py'
Nov 29 06:28:46 compute-1 sudo[118468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:46 compute-1 python3.9[118470]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397725.383341-1107-124750937152171/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:46 compute-1 sudo[118468]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:28:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:46.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:28:47 compute-1 sshd-session[112057]: Connection closed by 192.168.122.30 port 48338
Nov 29 06:28:47 compute-1 sshd-session[112054]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:28:47 compute-1 systemd[1]: session-43.scope: Deactivated successfully.
Nov 29 06:28:47 compute-1 systemd[1]: session-43.scope: Consumed 25.559s CPU time.
Nov 29 06:28:47 compute-1 systemd-logind[785]: Session 43 logged out. Waiting for processes to exit.
Nov 29 06:28:47 compute-1 systemd-logind[785]: Removed session 43.
Nov 29 06:28:47 compute-1 ceph-mon[80754]: pgmap v490: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:48.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:48.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:49 compute-1 sshd-session[117771]: Connection closed by 119.45.242.7 port 58180 [preauth]
Nov 29 06:28:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:50.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:50 compute-1 ceph-mon[80754]: pgmap v491: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:50.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:52.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:52 compute-1 sshd-session[118496]: Accepted publickey for zuul from 192.168.122.30 port 47930 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:28:52 compute-1 systemd-logind[785]: New session 44 of user zuul.
Nov 29 06:28:52 compute-1 systemd[1]: Started Session 44 of User zuul.
Nov 29 06:28:52 compute-1 sshd-session[118496]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:28:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:52.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:53 compute-1 ceph-mon[80754]: pgmap v492: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:53 compute-1 sudo[118649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mndralstnodtoakfkzpintlaftzzqqyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397732.802289-32-163760563098868/AnsiballZ_file.py'
Nov 29 06:28:53 compute-1 sudo[118649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:53 compute-1 python3.9[118651]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:53 compute-1 sudo[118649]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:54 compute-1 sudo[118801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-turcweuvgcvnuvhntrevdbrdpqildhwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397733.799546-68-184508673086860/AnsiballZ_stat.py'
Nov 29 06:28:54 compute-1 sudo[118801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:54 compute-1 python3.9[118803]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:54.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:54 compute-1 sudo[118801]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:54 compute-1 ceph-mon[80754]: pgmap v493: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:28:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:54.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:28:55 compute-1 sudo[118924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rggbdoeqfadxelowfvmmqyavahyhfqov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397733.799546-68-184508673086860/AnsiballZ_copy.py'
Nov 29 06:28:55 compute-1 sudo[118924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:55 compute-1 python3.9[118926]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397733.799546-68-184508673086860/.source.conf _original_basename=ceph.conf follow=False checksum=b678e866ce48244e104f356f74865d3398155ff0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:28:55 compute-1 sudo[118924]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:55 compute-1 sudo[119076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpviwlaebuwhngesgzgoulyidtecmwlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397735.4348497-68-77469776433554/AnsiballZ_stat.py'
Nov 29 06:28:55 compute-1 sudo[119076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:55 compute-1 python3.9[119078]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:56 compute-1 sudo[119076]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:56 compute-1 ceph-mon[80754]: pgmap v494: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:56 compute-1 sudo[119199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxbcetcaewqivrhlistyewzcxrgkrwgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397735.4348497-68-77469776433554/AnsiballZ_copy.py'
Nov 29 06:28:56 compute-1 sudo[119199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:56.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:56 compute-1 python3.9[119201]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397735.4348497-68-77469776433554/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=d5bc1b1c0617b147c8e3e13846b179249a244079 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:56 compute-1 sudo[119199]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:56 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 29 06:28:56 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:56.881358) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:28:56 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 29 06:28:56 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736881526, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1353, "num_deletes": 253, "total_data_size": 3254957, "memory_usage": 3303176, "flush_reason": "Manual Compaction"}
Nov 29 06:28:56 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 29 06:28:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:56 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736986792, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1349524, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7855, "largest_seqno": 9203, "table_properties": {"data_size": 1344764, "index_size": 2156, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12194, "raw_average_key_size": 20, "raw_value_size": 1334447, "raw_average_value_size": 2246, "num_data_blocks": 99, "num_entries": 594, "num_filter_entries": 594, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397574, "oldest_key_time": 1764397574, "file_creation_time": 1764397736, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:28:56 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 105507 microseconds, and 5986 cpu microseconds.
Nov 29 06:28:56 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:28:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:56.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:57 compute-1 sshd-session[118499]: Connection closed by 192.168.122.30 port 47930
Nov 29 06:28:57 compute-1 sshd-session[118496]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:28:57 compute-1 systemd[1]: session-44.scope: Deactivated successfully.
Nov 29 06:28:57 compute-1 systemd[1]: session-44.scope: Consumed 3.149s CPU time.
Nov 29 06:28:57 compute-1 systemd-logind[785]: Session 44 logged out. Waiting for processes to exit.
Nov 29 06:28:57 compute-1 systemd-logind[785]: Removed session 44.
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:56.986876) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1349524 bytes OK
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:56.986910) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.078766) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.078846) EVENT_LOG_v1 {"time_micros": 1764397737078830, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.078885) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 3248581, prev total WAL file size 3264022, number of live WAL files 2.
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.124337) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323534' seq:0, type:0; will stop at (end)
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1317KB)], [15(10002KB)]
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397737124679, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 11592103, "oldest_snapshot_seqno": -1}
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3886 keys, 9449294 bytes, temperature: kUnknown
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397737453892, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9449294, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9417812, "index_size": 20684, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9733, "raw_key_size": 95405, "raw_average_key_size": 24, "raw_value_size": 9341875, "raw_average_value_size": 2403, "num_data_blocks": 911, "num_entries": 3886, "num_filter_entries": 3886, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764397737, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.454229) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9449294 bytes
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.497539) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 35.2 rd, 28.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 9.8 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(15.6) write-amplify(7.0) OK, records in: 4364, records dropped: 478 output_compression: NoCompression
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.497594) EVENT_LOG_v1 {"time_micros": 1764397737497573, "job": 6, "event": "compaction_finished", "compaction_time_micros": 329344, "compaction_time_cpu_micros": 31293, "output_level": 6, "num_output_files": 1, "total_output_size": 9449294, "num_input_records": 4364, "num_output_records": 3886, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397737498278, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397737500810, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.124196) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.500994) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.501001) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.501002) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.501004) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.501005) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:28:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:58.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:28:58 compute-1 ceph-mon[80754]: pgmap v495: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:28:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:28:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:28:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:58.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:00 compute-1 ceph-mon[80754]: pgmap v496: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:29:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:00.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:29:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:00.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:02.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:02 compute-1 ceph-mon[80754]: pgmap v497: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:02 compute-1 sshd-session[119226]: Accepted publickey for zuul from 192.168.122.30 port 57718 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:29:02 compute-1 systemd-logind[785]: New session 45 of user zuul.
Nov 29 06:29:02 compute-1 systemd[1]: Started Session 45 of User zuul.
Nov 29 06:29:02 compute-1 sshd-session[119226]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:29:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:02.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:04 compute-1 python3.9[119379]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:29:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:04.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:04.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:05 compute-1 ceph-mon[80754]: pgmap v498: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:05 compute-1 sudo[119533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baznyvithfikfjczbkoziqogruxbsfvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397744.6627517-68-251347913199818/AnsiballZ_file.py'
Nov 29 06:29:05 compute-1 sudo[119533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:05 compute-1 python3.9[119535]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:05 compute-1 sudo[119533]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:06 compute-1 sudo[119685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzqwtwdjynoajekzbfcctnxdrcpmzzij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397745.7592094-68-163455034449344/AnsiballZ_file.py'
Nov 29 06:29:06 compute-1 sudo[119685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:06 compute-1 python3.9[119687]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:06 compute-1 sudo[119685]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:06.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:06 compute-1 ceph-mon[80754]: pgmap v499: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:29:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:07.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:29:07 compute-1 python3.9[119837]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:29:07 compute-1 sudo[119987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooywfrfcbuvigtkkrtegpqghxsjnxnjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397747.3522565-137-9227880004626/AnsiballZ_seboolean.py'
Nov 29 06:29:07 compute-1 sudo[119987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:07 compute-1 python3.9[119989]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 06:29:08 compute-1 ceph-mon[80754]: pgmap v500: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:08.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:09.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:10.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:11.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:11 compute-1 sshd-session[119994]: Invalid user saas from 71.70.164.48 port 45948
Nov 29 06:29:11 compute-1 sshd-session[119994]: Received disconnect from 71.70.164.48 port 45948:11: Bye Bye [preauth]
Nov 29 06:29:11 compute-1 sshd-session[119994]: Disconnected from invalid user saas 71.70.164.48 port 45948 [preauth]
Nov 29 06:29:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:29:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:12.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:29:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:13.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:13 compute-1 sudo[119987]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:13 compute-1 sudo[120146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvxpjonvaypcbbkanqzsnrfjgvczvrqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397753.41192-167-51716676155966/AnsiballZ_setup.py'
Nov 29 06:29:13 compute-1 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 29 06:29:13 compute-1 sudo[120146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:13 compute-1 ceph-mon[80754]: pgmap v501: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:14 compute-1 python3.9[120148]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:29:14 compute-1 sudo[120146]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:14.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:14 compute-1 ceph-mon[80754]: pgmap v502: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:14 compute-1 ceph-mon[80754]: pgmap v503: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:14 compute-1 sudo[120230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhmyzcxvfsfjjbwyxbefiohwrzapvfly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397753.41192-167-51716676155966/AnsiballZ_dnf.py'
Nov 29 06:29:14 compute-1 sudo[120230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:14 compute-1 python3.9[120232]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:29:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:15.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:16 compute-1 sudo[120230]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:16.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:16 compute-1 ceph-mon[80754]: pgmap v504: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:17.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:18 compute-1 sudo[120383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbbhnxlrmipuappdrohwfjfozyijwjmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397757.4854007-203-152519114548605/AnsiballZ_systemd.py'
Nov 29 06:29:18 compute-1 sudo[120383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:18 compute-1 python3.9[120385]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:29:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:18.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:18 compute-1 sudo[120383]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:19.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:19 compute-1 sudo[120538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqnacynnaktqasgyojwecjcxlorpgznc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397758.9351192-227-4621669053349/AnsiballZ_edpm_nftables_snippet.py'
Nov 29 06:29:19 compute-1 sudo[120538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:20.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:20 compute-1 python3[120540]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 29 06:29:20 compute-1 sudo[120538]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:20 compute-1 ceph-mon[80754]: pgmap v505: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:21.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:21 compute-1 sudo[120690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcthmcnfrbhfmhlauqcpujnakvpmgrly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397761.000959-254-216996444540244/AnsiballZ_file.py'
Nov 29 06:29:21 compute-1 sudo[120690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:21 compute-1 python3.9[120692]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:21 compute-1 sudo[120690]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:21 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:29:21 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 1338 writes, 9364 keys, 1338 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 1338 writes, 1338 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1338 writes, 9364 keys, 1338 commit groups, 1.0 writes per commit group, ingest: 19.42 MB, 0.03 MB/s
                                           Interval WAL: 1338 writes, 1338 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     36.2      0.31              0.03         3    0.102       0      0       0.0       0.0
                                             L6      1/0    9.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.7     49.4     44.5      0.42              0.07         2    0.211    8390    734       0.0       0.0
                                            Sum      1/0    9.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.7     28.6     41.0      0.73              0.10         5    0.145    8390    734       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.7     33.6     48.1      0.62              0.10         4    0.155    8390    734       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     49.4     44.5      0.42              0.07         2    0.211    8390    734       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     55.8      0.20              0.03         2    0.099       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.011, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.03 GB write, 0.05 MB/s write, 0.02 GB read, 0.03 MB/s read, 0.7 seconds
                                           Interval compaction: 0.03 GB write, 0.05 MB/s write, 0.02 GB read, 0.03 MB/s read, 0.6 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x562155f711f0#2 capacity: 304.00 MB usage: 806.88 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 5.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(36,700.22 KB,0.224937%) FilterBlock(5,31.98 KB,0.0102746%) IndexBlock(5,74.67 KB,0.0239874%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Nov 29 06:29:22 compute-1 ceph-mon[80754]: pgmap v506: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:22 compute-1 ceph-mon[80754]: pgmap v507: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:22 compute-1 sudo[120842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txyjdtkqgylzldjvkznbhychzbuzzpli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397761.706954-278-43030313549165/AnsiballZ_stat.py'
Nov 29 06:29:22 compute-1 sudo[120842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:22 compute-1 python3.9[120844]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:22 compute-1 sudo[120842]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:22.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:22 compute-1 sudo[120920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcoshsidnwskbeijmjkctkamczdyetkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397761.706954-278-43030313549165/AnsiballZ_file.py'
Nov 29 06:29:22 compute-1 sudo[120920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:22 compute-1 python3.9[120922]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:22 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:22 compute-1 sudo[120920]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:23.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:23 compute-1 sudo[121072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwdkhmhocmlexqotkninpbnwvczwqquw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397763.065149-314-237446759802022/AnsiballZ_stat.py'
Nov 29 06:29:23 compute-1 sudo[121072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:23 compute-1 python3.9[121074]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:23 compute-1 sudo[121072]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:24 compute-1 sudo[121150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgrwcqbzkbrshxdkbwvtzgxvqomzgmfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397763.065149-314-237446759802022/AnsiballZ_file.py'
Nov 29 06:29:24 compute-1 sudo[121150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:24 compute-1 ceph-mon[80754]: pgmap v508: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:24 compute-1 python3.9[121152]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.w9f0oyvv recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:24 compute-1 sudo[121150]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:24.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:24 compute-1 sudo[121302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxxjktkmunpfjkvecvjmhbtkeghmhrxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397764.541043-350-180706090505821/AnsiballZ_stat.py'
Nov 29 06:29:24 compute-1 sudo[121302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:25.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:25 compute-1 python3.9[121304]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:25 compute-1 sudo[121302]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:25 compute-1 sudo[121380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpshbgbvtknsrqduqlecotxheapnaplm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397764.541043-350-180706090505821/AnsiballZ_file.py'
Nov 29 06:29:25 compute-1 sudo[121380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:25 compute-1 python3.9[121382]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:25 compute-1 sudo[121380]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:26 compute-1 sudo[121532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzufkkblojnbpxnowfybjvyftlxpjpjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397765.8561304-389-143177924981584/AnsiballZ_command.py'
Nov 29 06:29:26 compute-1 sudo[121532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:26.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:27.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:27 compute-1 python3.9[121534]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:27 compute-1 sudo[121532]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:27 compute-1 sudo[121685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhssiozqeinnczotilvlptqbiyqipdop ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397767.481074-413-209123792737124/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:29:27 compute-1 sudo[121685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:27 compute-1 ceph-mon[80754]: pgmap v509: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:28 compute-1 python3[121687]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:29:28 compute-1 sudo[121685]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:29:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:28.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:29:28 compute-1 sudo[121837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhqbxsuyapbcokkfauquipvjcmxrtzyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397768.4213958-437-17887342573909/AnsiballZ_stat.py'
Nov 29 06:29:28 compute-1 sudo[121837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:28 compute-1 python3.9[121839]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:28 compute-1 sudo[121837]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:29.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:29 compute-1 sudo[121962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opnsadnpylyodworewqjzgkoxpmwticc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397768.4213958-437-17887342573909/AnsiballZ_copy.py'
Nov 29 06:29:29 compute-1 sudo[121962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:29 compute-1 ceph-mon[80754]: pgmap v510: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:29 compute-1 python3.9[121964]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397768.4213958-437-17887342573909/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:29 compute-1 sudo[121962]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:30 compute-1 sudo[122114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkyoengkagqtijabepqtmfqkglkijguu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397770.1671193-482-68565251634054/AnsiballZ_stat.py'
Nov 29 06:29:30 compute-1 sudo[122114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:30.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:30 compute-1 python3.9[122116]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:30 compute-1 sudo[122114]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:31.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:31 compute-1 sudo[122239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkwvpgqfeukzwprekxdnvooseupynkfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397770.1671193-482-68565251634054/AnsiballZ_copy.py'
Nov 29 06:29:31 compute-1 sudo[122239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:31 compute-1 python3.9[122241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397770.1671193-482-68565251634054/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:31 compute-1 sudo[122239]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:32 compute-1 sudo[122392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuyvvjlkecvsgbekgenavtodbetwoocg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397772.0004644-527-166895828604484/AnsiballZ_stat.py'
Nov 29 06:29:32 compute-1 sudo[122392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:32.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:32 compute-1 python3.9[122394]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:32 compute-1 sudo[122392]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:32 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:33 compute-1 sudo[122517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfrkwttscmzgbeiayxvznrnwpsjfakdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397772.0004644-527-166895828604484/AnsiballZ_copy.py'
Nov 29 06:29:33 compute-1 sudo[122517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:33.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:33 compute-1 python3.9[122519]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397772.0004644-527-166895828604484/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:33 compute-1 sudo[122517]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:33 compute-1 ceph-mon[80754]: pgmap v511: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:34 compute-1 sudo[122669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tczwfqherokoyciinulwcmicmbgoszzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397773.744868-572-108463550360372/AnsiballZ_stat.py'
Nov 29 06:29:34 compute-1 sudo[122669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:34 compute-1 python3.9[122671]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:34 compute-1 sudo[122669]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:34.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:34 compute-1 sudo[122794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvfojqncrenhlvbxsgfkjmtzxziagohh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397773.744868-572-108463550360372/AnsiballZ_copy.py'
Nov 29 06:29:34 compute-1 sudo[122794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:34 compute-1 python3.9[122796]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397773.744868-572-108463550360372/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:34 compute-1 sudo[122794]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:34 compute-1 ceph-mon[80754]: pgmap v512: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:34 compute-1 ceph-mon[80754]: pgmap v513: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:35.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:35 compute-1 sudo[122946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klxlbxhblcrdfascpktofjghlzhovfgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397775.5084586-617-211631853474894/AnsiballZ_stat.py'
Nov 29 06:29:35 compute-1 sudo[122946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:36 compute-1 python3.9[122948]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:36 compute-1 sudo[122946]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:36 compute-1 sudo[123071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnzujdgqdcrzjymkpzuxvlcmwxidoxbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397775.5084586-617-211631853474894/AnsiballZ_copy.py'
Nov 29 06:29:36 compute-1 sudo[123071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:36.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:36 compute-1 python3.9[123073]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397775.5084586-617-211631853474894/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:36 compute-1 sudo[123071]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:37.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:37 compute-1 sudo[123223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cznvvuzfbkwyquipbryzhfnteihmupdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397777.2677684-662-270387048308081/AnsiballZ_file.py'
Nov 29 06:29:37 compute-1 sudo[123223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:37 compute-1 python3.9[123225]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:37 compute-1 sudo[123223]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:37 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:38 compute-1 sudo[123375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgbnqviibteohpbarepzkzmryrvfmdnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397778.1488638-686-66773278443258/AnsiballZ_command.py'
Nov 29 06:29:38 compute-1 sudo[123375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:38.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:38 compute-1 ceph-mon[80754]: pgmap v514: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:38 compute-1 python3.9[123377]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:38 compute-1 sudo[123375]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:39.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:39 compute-1 sudo[123530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azapetddqitathcdnkpsqhgvbwbyjfaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397778.9381704-710-12340795334524/AnsiballZ_blockinfile.py'
Nov 29 06:29:39 compute-1 sudo[123530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:39 compute-1 python3.9[123532]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:39 compute-1 sudo[123530]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:40 compute-1 sudo[123682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqtvboencgxhzxlhgumujkikybvgunqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397780.0794551-737-133583920969413/AnsiballZ_command.py'
Nov 29 06:29:40 compute-1 sudo[123682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:40.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:40 compute-1 python3.9[123684]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:40 compute-1 sudo[123682]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:29:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:41.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:29:41 compute-1 sudo[123835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ladnlhwegixbydhmnlgiztqbxloizwpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397780.8680842-761-267478183086991/AnsiballZ_stat.py'
Nov 29 06:29:41 compute-1 sudo[123835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:41 compute-1 python3.9[123837]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:29:41 compute-1 sudo[123835]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:42 compute-1 sudo[123864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:42 compute-1 sudo[123864]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:42 compute-1 sudo[123864]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:42 compute-1 sudo[123889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:29:42 compute-1 sudo[123889]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:42 compute-1 sudo[123889]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:42 compute-1 ceph-mon[80754]: pgmap v515: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:42 compute-1 ceph-mon[80754]: pgmap v516: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:42 compute-1 sudo[123917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:29:42 compute-1 sudo[123917]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:42 compute-1 sudo[123917]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:42 compute-1 sudo[123962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:29:42 compute-1 sudo[123962]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:29:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:42.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:42 compute-1 sudo[124106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvdhanjakhafkhehqgjrowclyiyczzog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397782.2982776-785-12478056684415/AnsiballZ_command.py'
Nov 29 06:29:42 compute-1 sudo[124106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:42 compute-1 sudo[123962]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:42 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:42 compute-1 python3.9[124108]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:42 compute-1 sudo[124106]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:43.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:43 compute-1 sudo[124275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yonouqwvcegvcjvfnulxesvqxfazvvgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397783.2177968-809-188536300091401/AnsiballZ_file.py'
Nov 29 06:29:43 compute-1 sudo[124275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:43 compute-1 python3.9[124277]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:43 compute-1 sudo[124275]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:43 compute-1 sshd-session[122365]: Connection closed by 119.45.242.7 port 40718 [preauth]
Nov 29 06:29:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:44.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:45.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:45 compute-1 ceph-mon[80754]: pgmap v517: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:45 compute-1 ceph-mon[80754]: pgmap v518: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:45 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 06:29:45 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 06:29:46 compute-1 python3.9[124428]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:29:46 compute-1 ceph-mon[80754]: pgmap v519: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:29:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:46.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:47.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:47 compute-1 ceph-mon[80754]: pgmap v520: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Nov 29 06:29:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:29:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:48.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:29:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:49.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:29:49 compute-1 ceph-mon[80754]: pgmap v521: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:49 compute-1 sudo[124579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xppmkkkpeybzadkrgfpwsulqckudxpzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397788.9807503-929-52145987695025/AnsiballZ_command.py'
Nov 29 06:29:49 compute-1 sudo[124579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:49 compute-1 python3.9[124581]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:49 compute-1 ovs-vsctl[124582]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 29 06:29:49 compute-1 sudo[124579]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:50 compute-1 sudo[124732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsesodflmsxmbyxzabssvgxzjohxawmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397790.0897717-956-157740566371539/AnsiballZ_command.py'
Nov 29 06:29:50 compute-1 sudo[124732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:50.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:50 compute-1 python3.9[124734]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:50 compute-1 sudo[124732]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:29:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:51.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:29:51 compute-1 ceph-mon[80754]: pgmap v522: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:52.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:53 compute-1 sudo[124887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idnufbzsbueipfqqgarhwcucyoxsmsql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397792.717424-980-87305121357008/AnsiballZ_command.py'
Nov 29 06:29:53 compute-1 sudo[124887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:53.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:53 compute-1 python3.9[124889]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:29:53 compute-1 ovs-vsctl[124890]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 29 06:29:53 compute-1 sudo[124887]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:53 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:29:53 compute-1 ceph-mon[80754]: pgmap v523: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:53 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:29:54 compute-1 python3.9[125040]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:29:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:29:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:29:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:29:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:29:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:29:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:29:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:54.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:54 compute-1 sudo[125192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvmmejhbxlsijeexfjpwjqjilwflobmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397794.5729322-1031-269589866362184/AnsiballZ_file.py'
Nov 29 06:29:54 compute-1 sudo[125192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:55 compute-1 python3.9[125194]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:55.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:55 compute-1 sudo[125192]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:55 compute-1 ceph-mon[80754]: pgmap v524: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:29:55 compute-1 sudo[125346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipyveumbhsdmjvxwsbpugqkrejrtvnwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397795.3924425-1055-278035085226431/AnsiballZ_stat.py'
Nov 29 06:29:55 compute-1 sudo[125346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:55 compute-1 python3.9[125348]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:56 compute-1 sudo[125346]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:56 compute-1 sshd-session[125195]: Received disconnect from 118.194.230.250 port 50992:11: Bye Bye [preauth]
Nov 29 06:29:56 compute-1 sshd-session[125195]: Disconnected from authenticating user root 118.194.230.250 port 50992 [preauth]
Nov 29 06:29:56 compute-1 sudo[125424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebfompxtkforvefcvufccsamnusqfyen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397795.3924425-1055-278035085226431/AnsiballZ_file.py'
Nov 29 06:29:56 compute-1 sudo[125424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:56 compute-1 python3.9[125426]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:56 compute-1 sudo[125424]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:56.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:56 compute-1 sudo[125576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ronekhgzmaqxgqexcinlgowzprfsyust ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397796.63981-1055-29245409909232/AnsiballZ_stat.py'
Nov 29 06:29:56 compute-1 sudo[125576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:57.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:57 compute-1 python3.9[125578]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:57 compute-1 sudo[125576]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:57 compute-1 sudo[125654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uirpfkpznmgkbnhjoyvnzsoopzjkbngl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397796.63981-1055-29245409909232/AnsiballZ_file.py'
Nov 29 06:29:57 compute-1 sudo[125654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:57 compute-1 python3.9[125656]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:57 compute-1 sudo[125654]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:58 compute-1 sudo[125806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orqwmptmgebnzknoqsnhcvaidmdcyowd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397798.134613-1124-262605505238463/AnsiballZ_file.py'
Nov 29 06:29:58 compute-1 sudo[125806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:29:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:58.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:29:58 compute-1 python3.9[125808]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:58 compute-1 sudo[125806]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:29:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:29:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:59.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:29:59 compute-1 sudo[125958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muvlblcizcfrabrckwjjgdyewmzsozuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397799.0400822-1148-174246530724413/AnsiballZ_stat.py'
Nov 29 06:29:59 compute-1 sudo[125958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:29:59 compute-1 python3.9[125960]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:59 compute-1 sudo[125958]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:00 compute-1 sudo[126036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpqmydbbaknstahkxcrhoxfkobvkjooy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397799.0400822-1148-174246530724413/AnsiballZ_file.py'
Nov 29 06:30:00 compute-1 sudo[126036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:00 compute-1 python3.9[126038]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:00 compute-1 sudo[126036]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:00.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:00 compute-1 ceph-mon[80754]: pgmap v525: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:00 compute-1 sudo[126188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzgmsrakrzvuxyckjgzpanpjryxalije ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397800.5521436-1184-13123727390725/AnsiballZ_stat.py'
Nov 29 06:30:00 compute-1 sudo[126188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:01 compute-1 python3.9[126190]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:01.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:01 compute-1 sudo[126188]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:01 compute-1 sudo[126266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eesznyyvfqhocycgossxfbwieowlgbyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397800.5521436-1184-13123727390725/AnsiballZ_file.py'
Nov 29 06:30:01 compute-1 sudo[126266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:01 compute-1 ceph-mon[80754]: pgmap v526: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:01 compute-1 ceph-mon[80754]: overall HEALTH_OK
Nov 29 06:30:01 compute-1 python3.9[126268]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:01 compute-1 sudo[126266]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:02 compute-1 sudo[126418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koibtgraifinyqdqnysnfynylppldosc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397801.8426657-1220-232885820481015/AnsiballZ_systemd.py'
Nov 29 06:30:02 compute-1 sudo[126418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:02 compute-1 python3.9[126420]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:30:02 compute-1 systemd[1]: Reloading.
Nov 29 06:30:02 compute-1 systemd-rc-local-generator[126449]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:02 compute-1 systemd-sysv-generator[126452]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:02.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:02 compute-1 sudo[126418]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:03.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:03 compute-1 sudo[126609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqzcdgvyttcodvoltvgcafenrmxupjld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397803.0674977-1244-77261072914407/AnsiballZ_stat.py'
Nov 29 06:30:03 compute-1 sudo[126609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:03 compute-1 python3.9[126611]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:03 compute-1 sudo[126609]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:04 compute-1 ceph-mon[80754]: pgmap v527: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:04 compute-1 sudo[126687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyemqwlektoxiokcujxwqexjnsqsmgik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397803.0674977-1244-77261072914407/AnsiballZ_file.py'
Nov 29 06:30:04 compute-1 sudo[126687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:04 compute-1 python3.9[126689]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:04 compute-1 sudo[126687]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:04.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:04 compute-1 sudo[126839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgnslbnlthaoyxxsimcaoehpcrdxjdtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397804.4996254-1280-115085448293309/AnsiballZ_stat.py'
Nov 29 06:30:04 compute-1 sudo[126839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:04 compute-1 python3.9[126841]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:05 compute-1 sudo[126839]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:05.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:05 compute-1 ceph-mon[80754]: pgmap v528: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:05 compute-1 sudo[126919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmftybrtqfzjgipczkccmqbutwkgzdrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397804.4996254-1280-115085448293309/AnsiballZ_file.py'
Nov 29 06:30:05 compute-1 sudo[126919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:05 compute-1 python3.9[126921]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:05 compute-1 sudo[126919]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:06 compute-1 sudo[127071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpocmxeoaampedqapvthzokjesruledp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397805.7444358-1316-192905168035550/AnsiballZ_systemd.py'
Nov 29 06:30:06 compute-1 sudo[127071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:06 compute-1 python3.9[127073]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:30:06 compute-1 systemd[1]: Reloading.
Nov 29 06:30:06 compute-1 ceph-mon[80754]: pgmap v529: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:06 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:30:06 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:30:06 compute-1 systemd-rc-local-generator[127103]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:06 compute-1 systemd-sysv-generator[127106]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:06.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:06 compute-1 sudo[127111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:30:06 compute-1 sudo[127111]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:30:06 compute-1 sudo[127111]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:06 compute-1 systemd[1]: Starting Create netns directory...
Nov 29 06:30:06 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:30:06 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:30:06 compute-1 systemd[1]: Finished Create netns directory.
Nov 29 06:30:06 compute-1 sudo[127138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:30:06 compute-1 sudo[127138]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:30:06 compute-1 sudo[127138]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:06 compute-1 sudo[127071]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:07.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:07 compute-1 sudo[127318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqsidqkokhdompwqapwhxfvnoraztkxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397807.3503802-1347-76679259796521/AnsiballZ_file.py'
Nov 29 06:30:07 compute-1 sudo[127318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:07 compute-1 python3.9[127320]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:07 compute-1 sudo[127318]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:08 compute-1 sudo[127470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecqtxuhmrlhtdktgjfdnzkwjaxhexurf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397808.1312945-1370-242769842265489/AnsiballZ_stat.py'
Nov 29 06:30:08 compute-1 sudo[127470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:08.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:08 compute-1 python3.9[127472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:08 compute-1 sudo[127470]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:09 compute-1 sudo[127593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbcnfzwttsgrbvuocpjauwxtsmluvlky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397808.1312945-1370-242769842265489/AnsiballZ_copy.py'
Nov 29 06:30:09 compute-1 sudo[127593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:09.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:09 compute-1 python3.9[127595]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397808.1312945-1370-242769842265489/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:09 compute-1 sudo[127593]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:10 compute-1 sudo[127747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quiwzfmlxsysnexjhuxywkgyndxxboul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397809.8224123-1421-224578607234296/AnsiballZ_file.py'
Nov 29 06:30:10 compute-1 sudo[127747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:10 compute-1 python3.9[127749]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:10 compute-1 sudo[127747]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000054s ======
Nov 29 06:30:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:10.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Nov 29 06:30:10 compute-1 sudo[127899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttwflxgdiwxdiiacvsvnvmotkbndmrkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397810.6662323-1445-246328890831847/AnsiballZ_stat.py'
Nov 29 06:30:10 compute-1 sudo[127899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:11.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:11 compute-1 python3.9[127901]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:11 compute-1 sudo[127899]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:11 compute-1 sudo[128022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoxjtxjtobtraetdivtipknrpaqllvfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397810.6662323-1445-246328890831847/AnsiballZ_copy.py'
Nov 29 06:30:11 compute-1 sudo[128022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:11 compute-1 python3.9[128024]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397810.6662323-1445-246328890831847/.source.json _original_basename=.gfa_p6x1 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:11 compute-1 sudo[128022]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:12 compute-1 sudo[128174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcpeuqfxfrwdzhcffoomjcsgtukqyakc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397812.15986-1490-262029189597274/AnsiballZ_file.py'
Nov 29 06:30:12 compute-1 sudo[128174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:12.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:12 compute-1 python3.9[128176]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:12 compute-1 sudo[128174]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:12 compute-1 sshd-session[127689]: Invalid user aaa from 66.94.122.234 port 51800
Nov 29 06:30:12 compute-1 sshd-session[127689]: Received disconnect from 66.94.122.234 port 51800:11: Bye Bye [preauth]
Nov 29 06:30:12 compute-1 sshd-session[127689]: Disconnected from invalid user aaa 66.94.122.234 port 51800 [preauth]
Nov 29 06:30:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:13.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:13 compute-1 sudo[128326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uylezyszvbtrxbxirtwdwepyuioyabow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397813.0330582-1514-38830674280174/AnsiballZ_stat.py'
Nov 29 06:30:13 compute-1 sudo[128326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:13 compute-1 sudo[128326]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:13 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:30:13 compute-1 sudo[128449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yszedzgdsrsemyjzayvdokussuuhhtga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397813.0330582-1514-38830674280174/AnsiballZ_copy.py'
Nov 29 06:30:13 compute-1 sudo[128449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:14 compute-1 sudo[128449]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:14 compute-1 ceph-mon[80754]: pgmap v530: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:14.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:15.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:15 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:15 compute-1 sudo[128601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgouircssudwmkvisuewmpwwbsatgwkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397814.9240222-1565-126725559340710/AnsiballZ_container_config_data.py'
Nov 29 06:30:15 compute-1 sudo[128601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:15 compute-1 python3.9[128603]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 29 06:30:15 compute-1 sudo[128601]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:16 compute-1 sudo[128753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znmlwokomhdnbkfsfpnmxqaxolvuzhzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397815.9608803-1592-246715448022038/AnsiballZ_container_config_hash.py'
Nov 29 06:30:16 compute-1 sudo[128753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:16 compute-1 python3.9[128755]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:30:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:16.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:16 compute-1 sudo[128753]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:17.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:17 compute-1 ceph-mon[80754]: pgmap v531: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:17 compute-1 ceph-mon[80754]: pgmap v532: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:17 compute-1 ceph-mon[80754]: pgmap v533: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:17 compute-1 ceph-mon[80754]: pgmap v534: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:17 compute-1 sudo[128905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrcwecwuvfcsqsmyvmmqmsxupjswjzif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397817.031021-1619-204591526366128/AnsiballZ_podman_container_info.py'
Nov 29 06:30:17 compute-1 sudo[128905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:17 compute-1 python3.9[128907]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 06:30:17 compute-1 sudo[128905]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:18 compute-1 ceph-mon[80754]: pgmap v535: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:18.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:19.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:19 compute-1 sudo[129084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ronswftmrkhcqpuawilhbkjkvhqhrrbr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397818.9637022-1658-146327080065223/AnsiballZ_edpm_container_manage.py'
Nov 29 06:30:19 compute-1 sudo[129084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:19 compute-1 ceph-mon[80754]: pgmap v536: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:19 compute-1 python3[129086]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:30:20 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:20.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:21.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:21 compute-1 ceph-mon[80754]: pgmap v537: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:22.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:22 compute-1 ceph-mon[80754]: pgmap v538: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:23.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:24.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:25.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:26 compute-1 podman[129099]: 2025-11-29 06:30:26.043950194 +0000 UTC m=+6.150896444 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 06:30:26 compute-1 podman[129219]: 2025-11-29 06:30:26.200024555 +0000 UTC m=+0.054483990 container create e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 06:30:26 compute-1 podman[129219]: 2025-11-29 06:30:26.176970204 +0000 UTC m=+0.031429659 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 06:30:26 compute-1 python3[129086]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 06:30:26 compute-1 sudo[129084]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:26.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:27.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:28 compute-1 ceph-mon[80754]: pgmap v539: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:28.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:29.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:29 compute-1 ceph-mon[80754]: pgmap v540: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:29 compute-1 ceph-mon[80754]: pgmap v541: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:29 compute-1 sshd-session[129047]: error: kex_exchange_identification: read: Connection timed out
Nov 29 06:30:29 compute-1 sshd-session[129047]: banner exchange: Connection from 119.45.242.7 port 51482: Connection timed out
Nov 29 06:30:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:30 compute-1 sudo[129407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onyalmlcskmdbqbzvplwhxrsrmzqwwtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397830.2059288-1682-217231872564604/AnsiballZ_stat.py'
Nov 29 06:30:30 compute-1 sudo[129407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:30.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:30 compute-1 python3.9[129409]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:30:30 compute-1 sudo[129407]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:31.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:31 compute-1 sudo[129561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wglkihsvdcwwesvcmqqigcthgtigthco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397831.0155554-1709-81159089965401/AnsiballZ_file.py'
Nov 29 06:30:31 compute-1 sudo[129561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:31 compute-1 python3.9[129563]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:31 compute-1 sudo[129561]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:32 compute-1 sudo[129637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qstplcdvltfpqlgqkjpfxvnxgaenezjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397831.0155554-1709-81159089965401/AnsiballZ_stat.py'
Nov 29 06:30:32 compute-1 sudo[129637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:32.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:32 compute-1 python3.9[129639]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:30:32 compute-1 sudo[129637]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:33.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:33 compute-1 sudo[129793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pophacqcypduyxzokpkgcnbicagegpwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397832.9650795-1709-213072460580755/AnsiballZ_copy.py'
Nov 29 06:30:33 compute-1 sudo[129793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:33 compute-1 python3.9[129795]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397832.9650795-1709-213072460580755/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:33 compute-1 sudo[129793]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:34 compute-1 sudo[129869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-risisjgwfbfosxizcuqesiywcfcefzvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397832.9650795-1709-213072460580755/AnsiballZ_systemd.py'
Nov 29 06:30:34 compute-1 sudo[129869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:34 compute-1 python3.9[129871]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:30:34 compute-1 systemd[1]: Reloading.
Nov 29 06:30:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:34.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:34 compute-1 systemd-rc-local-generator[129898]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:34 compute-1 systemd-sysv-generator[129902]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:34 compute-1 sudo[129869]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:35.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:35 compute-1 sudo[129981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cofuojfecizwpielryhoedpkmikypzwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397832.9650795-1709-213072460580755/AnsiballZ_systemd.py'
Nov 29 06:30:35 compute-1 sudo[129981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:35 compute-1 python3.9[129983]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:30:35 compute-1 systemd[1]: Reloading.
Nov 29 06:30:35 compute-1 systemd-rc-local-generator[130012]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:35 compute-1 systemd-sysv-generator[130015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:35 compute-1 systemd[1]: Starting ovn_controller container...
Nov 29 06:30:35 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:30:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d06215a7c9e2f4d808980f1813b1b8a04e986648f3584f9f2e3ba032b924b4a8/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 06:30:36 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275.
Nov 29 06:30:36 compute-1 podman[130024]: 2025-11-29 06:30:36.03736806 +0000 UTC m=+0.124107610 container init e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 29 06:30:36 compute-1 ovn_controller[130039]: + sudo -E kolla_set_configs
Nov 29 06:30:36 compute-1 podman[130024]: 2025-11-29 06:30:36.065601973 +0000 UTC m=+0.152341433 container start e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 06:30:36 compute-1 edpm-start-podman-container[130024]: ovn_controller
Nov 29 06:30:36 compute-1 systemd[1]: Created slice User Slice of UID 0.
Nov 29 06:30:36 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 29 06:30:36 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 29 06:30:36 compute-1 systemd[1]: Starting User Manager for UID 0...
Nov 29 06:30:36 compute-1 edpm-start-podman-container[130023]: Creating additional drop-in dependency for "ovn_controller" (e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275)
Nov 29 06:30:36 compute-1 systemd[130077]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Nov 29 06:30:36 compute-1 podman[130046]: 2025-11-29 06:30:36.150551425 +0000 UTC m=+0.074857051 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 06:30:36 compute-1 systemd[1]: e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275-5d0a1a16db78e533.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:30:36 compute-1 systemd[1]: e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275-5d0a1a16db78e533.service: Failed with result 'exit-code'.
Nov 29 06:30:36 compute-1 systemd[1]: Reloading.
Nov 29 06:30:36 compute-1 systemd-rc-local-generator[130123]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:36 compute-1 systemd-sysv-generator[130128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:36 compute-1 systemd[130077]: Queued start job for default target Main User Target.
Nov 29 06:30:36 compute-1 systemd[130077]: Created slice User Application Slice.
Nov 29 06:30:36 compute-1 systemd[130077]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 29 06:30:36 compute-1 systemd[130077]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 06:30:36 compute-1 systemd[130077]: Reached target Paths.
Nov 29 06:30:36 compute-1 systemd[130077]: Reached target Timers.
Nov 29 06:30:36 compute-1 systemd[130077]: Starting D-Bus User Message Bus Socket...
Nov 29 06:30:36 compute-1 systemd[130077]: Starting Create User's Volatile Files and Directories...
Nov 29 06:30:36 compute-1 systemd[130077]: Listening on D-Bus User Message Bus Socket.
Nov 29 06:30:36 compute-1 systemd[130077]: Reached target Sockets.
Nov 29 06:30:36 compute-1 systemd[130077]: Finished Create User's Volatile Files and Directories.
Nov 29 06:30:36 compute-1 systemd[130077]: Reached target Basic System.
Nov 29 06:30:36 compute-1 systemd[130077]: Reached target Main User Target.
Nov 29 06:30:36 compute-1 systemd[130077]: Startup finished in 168ms.
Nov 29 06:30:36 compute-1 systemd[1]: Started User Manager for UID 0.
Nov 29 06:30:36 compute-1 systemd[1]: Started ovn_controller container.
Nov 29 06:30:36 compute-1 systemd[1]: Started Session c1 of User root.
Nov 29 06:30:36 compute-1 sudo[129981]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:36 compute-1 ovn_controller[130039]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:30:36 compute-1 ovn_controller[130039]: INFO:__main__:Validating config file
Nov 29 06:30:36 compute-1 ovn_controller[130039]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:30:36 compute-1 ovn_controller[130039]: INFO:__main__:Writing out command to execute
Nov 29 06:30:36 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 29 06:30:36 compute-1 ovn_controller[130039]: ++ cat /run_command
Nov 29 06:30:36 compute-1 ovn_controller[130039]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 06:30:36 compute-1 ovn_controller[130039]: + ARGS=
Nov 29 06:30:36 compute-1 ovn_controller[130039]: + sudo kolla_copy_cacerts
Nov 29 06:30:36 compute-1 systemd[1]: Started Session c2 of User root.
Nov 29 06:30:36 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 29 06:30:36 compute-1 ovn_controller[130039]: + [[ ! -n '' ]]
Nov 29 06:30:36 compute-1 ovn_controller[130039]: + . kolla_extend_start
Nov 29 06:30:36 compute-1 ovn_controller[130039]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 06:30:36 compute-1 ovn_controller[130039]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 29 06:30:36 compute-1 ovn_controller[130039]: + umask 0022
Nov 29 06:30:36 compute-1 ovn_controller[130039]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 29 06:30:36 compute-1 NetworkManager[49015]: <info>  [1764397836.6425] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Nov 29 06:30:36 compute-1 NetworkManager[49015]: <info>  [1764397836.6432] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:30:36 compute-1 NetworkManager[49015]: <info>  [1764397836.6443] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 29 06:30:36 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:30:36 compute-1 NetworkManager[49015]: <info>  [1764397836.6448] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Nov 29 06:30:36 compute-1 NetworkManager[49015]: <info>  [1764397836.6451] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 06:30:36 compute-1 kernel: br-int: entered promiscuous mode
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 29 06:30:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:36.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:30:36 compute-1 ovn_controller[130039]: 2025-11-29T06:30:36Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:30:36 compute-1 NetworkManager[49015]: <info>  [1764397836.6755] manager: (ovn-e15f55-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 29 06:30:36 compute-1 systemd-udevd[130173]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:30:36 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Nov 29 06:30:36 compute-1 NetworkManager[49015]: <info>  [1764397836.6955] device (genev_sys_6081): carrier: link connected
Nov 29 06:30:36 compute-1 NetworkManager[49015]: <info>  [1764397836.6957] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Nov 29 06:30:36 compute-1 systemd-udevd[130175]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:30:36 compute-1 ceph-mon[80754]: pgmap v542: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:37.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:38.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:39.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:39 compute-1 NetworkManager[49015]: <info>  [1764397839.5403] manager: (ovn-fa6f2e-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Nov 29 06:30:39 compute-1 ceph-mon[80754]: pgmap v543: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:39 compute-1 ceph-mon[80754]: pgmap v544: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:39 compute-1 ceph-mon[80754]: pgmap v545: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:40 compute-1 NetworkManager[49015]: <info>  [1764397840.0786] manager: (ovn-93db78-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Nov 29 06:30:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:40 compute-1 ceph-mon[80754]: pgmap v546: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:40 compute-1 sudo[130304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpqimzadwvptajlozxeykvyaujoazudk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397840.118529-1793-110045700163996/AnsiballZ_command.py'
Nov 29 06:30:40 compute-1 sudo[130304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:40 compute-1 python3.9[130306]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:40 compute-1 ovs-vsctl[130307]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 29 06:30:40 compute-1 sudo[130304]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:40.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:41.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:41 compute-1 sudo[130457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxkqpvstzgwrbfazxuziginkrrcfzaew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397840.91843-1817-120110696335086/AnsiballZ_command.py'
Nov 29 06:30:41 compute-1 sudo[130457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:41 compute-1 python3.9[130459]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:41 compute-1 ovs-vsctl[130461]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 29 06:30:41 compute-1 sudo[130457]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:42 compute-1 ceph-mon[80754]: pgmap v547: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:42 compute-1 sudo[130612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqtzovzkdpzdxjccrknmwctedevbplpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397842.095163-1859-143058925559551/AnsiballZ_command.py'
Nov 29 06:30:42 compute-1 sudo[130612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:42.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:42 compute-1 python3.9[130614]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:42 compute-1 ovs-vsctl[130615]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 29 06:30:42 compute-1 sudo[130612]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:43.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:43 compute-1 sshd-session[119229]: Connection closed by 192.168.122.30 port 57718
Nov 29 06:30:43 compute-1 sshd-session[119226]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:30:43 compute-1 systemd[1]: session-45.scope: Deactivated successfully.
Nov 29 06:30:43 compute-1 systemd[1]: session-45.scope: Consumed 1min 244ms CPU time.
Nov 29 06:30:43 compute-1 systemd-logind[785]: Session 45 logged out. Waiting for processes to exit.
Nov 29 06:30:43 compute-1 systemd-logind[785]: Removed session 45.
Nov 29 06:30:43 compute-1 ceph-mon[80754]: pgmap v548: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:44.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:45.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:46 compute-1 ceph-mon[80754]: pgmap v549: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:46.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:46 compute-1 systemd[1]: Stopping User Manager for UID 0...
Nov 29 06:30:46 compute-1 systemd[130077]: Activating special unit Exit the Session...
Nov 29 06:30:46 compute-1 systemd[130077]: Stopped target Main User Target.
Nov 29 06:30:46 compute-1 systemd[130077]: Stopped target Basic System.
Nov 29 06:30:46 compute-1 systemd[130077]: Stopped target Paths.
Nov 29 06:30:46 compute-1 systemd[130077]: Stopped target Sockets.
Nov 29 06:30:46 compute-1 systemd[130077]: Stopped target Timers.
Nov 29 06:30:46 compute-1 systemd[130077]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 06:30:46 compute-1 systemd[130077]: Closed D-Bus User Message Bus Socket.
Nov 29 06:30:46 compute-1 systemd[130077]: Stopped Create User's Volatile Files and Directories.
Nov 29 06:30:46 compute-1 systemd[130077]: Removed slice User Application Slice.
Nov 29 06:30:46 compute-1 systemd[130077]: Reached target Shutdown.
Nov 29 06:30:46 compute-1 systemd[130077]: Finished Exit the Session.
Nov 29 06:30:46 compute-1 systemd[130077]: Reached target Exit the Session.
Nov 29 06:30:46 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Nov 29 06:30:46 compute-1 systemd[1]: Stopped User Manager for UID 0.
Nov 29 06:30:46 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 29 06:30:46 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 29 06:30:46 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 29 06:30:46 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 29 06:30:46 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Nov 29 06:30:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:47.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:47 compute-1 ovn_controller[130039]: 2025-11-29T06:30:47Z|00025|memory|INFO|15872 kB peak resident set size after 11.0 seconds
Nov 29 06:30:47 compute-1 ovn_controller[130039]: 2025-11-29T06:30:47Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Nov 29 06:30:48 compute-1 ceph-mon[80754]: pgmap v550: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:48 compute-1 sshd-session[130643]: Accepted publickey for zuul from 192.168.122.30 port 38708 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:30:48 compute-1 systemd-logind[785]: New session 47 of user zuul.
Nov 29 06:30:48 compute-1 systemd[1]: Started Session 47 of User zuul.
Nov 29 06:30:48 compute-1 sshd-session[130643]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:30:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:48.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:49.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:49 compute-1 python3.9[130796]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:30:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:50.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:51.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:52 compute-1 sudo[130950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcoxehobrktehrbhnhteeoxdclpxgzpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397852.1056304-68-21221254550288/AnsiballZ_file.py'
Nov 29 06:30:52 compute-1 sudo[130950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:52.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:52 compute-1 python3.9[130952]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:52 compute-1 sudo[130950]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:30:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:53.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:30:53 compute-1 sudo[131102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejckqvpthejsabkrgebtfcftkartehpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397853.0126317-68-266744705148862/AnsiballZ_file.py'
Nov 29 06:30:53 compute-1 sudo[131102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:53 compute-1 python3.9[131104]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:53 compute-1 sudo[131102]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:53 compute-1 ceph-mon[80754]: pgmap v551: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:53 compute-1 ceph-mon[80754]: pgmap v552: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:53 compute-1 ceph-mon[80754]: pgmap v553: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:54 compute-1 sudo[131254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiogwsxzsmvzcjgzgkgdhhobtbjitctn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397853.742237-68-33830675535006/AnsiballZ_file.py'
Nov 29 06:30:54 compute-1 sudo[131254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:54 compute-1 python3.9[131256]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:54 compute-1 sudo[131254]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:54.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:54 compute-1 sudo[131407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibiueapmpufapjzzrxwuhzydxoxcelrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397854.476549-68-166464344533538/AnsiballZ_file.py'
Nov 29 06:30:54 compute-1 sudo[131407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:54 compute-1 python3.9[131409]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:54 compute-1 sudo[131407]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:55.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:55 compute-1 ceph-mon[80754]: pgmap v554: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:30:55 compute-1 sudo[131559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoyuctgnzddpwcnzjxjzqrdeykqgfquq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397855.1577544-68-98409629298299/AnsiballZ_file.py'
Nov 29 06:30:55 compute-1 sudo[131559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:55 compute-1 python3.9[131561]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:55 compute-1 sudo[131559]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:56 compute-1 sshd-session[130640]: error: kex_exchange_identification: read: Connection timed out
Nov 29 06:30:56 compute-1 sshd-session[130640]: banner exchange: Connection from 14.103.107.234 port 36470: Connection timed out
Nov 29 06:30:56 compute-1 python3.9[131711]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:30:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:30:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:56.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:30:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:57.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:57 compute-1 sudo[131861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiguefiqizwdommesqffcznsqsshwmck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397856.8334918-200-120612109329800/AnsiballZ_seboolean.py'
Nov 29 06:30:57 compute-1 sudo[131861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:57 compute-1 python3.9[131863]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 06:30:57 compute-1 ceph-mon[80754]: pgmap v555: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:58 compute-1 sudo[131861]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:58.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:59 compute-1 python3.9[132013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:30:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:30:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:59.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:30:59 compute-1 ceph-mon[80754]: pgmap v556: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:30:59 compute-1 python3.9[132134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397858.4564419-224-157402102927445/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:00 compute-1 python3.9[132284]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:00.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:01 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Nov 29 06:31:01 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Nov 29 06:31:01 compute-1 python3.9[132405]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397859.992953-269-263644126277502/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:01.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:01 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Nov 29 06:31:01 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Nov 29 06:31:01 compute-1 ceph-mon[80754]: pgmap v557: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:01 compute-1 sudo[132555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhhcmxgsoyvazlmerfwuzvdaqtbrbsmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397861.57518-320-201229511462574/AnsiballZ_setup.py'
Nov 29 06:31:01 compute-1 sudo[132555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:02 compute-1 python3.9[132557]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:31:02 compute-1 sudo[132555]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:02.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:02 compute-1 sudo[132639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pampmpabdaveeblwpwcmdrzjnqkstzri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397861.57518-320-201229511462574/AnsiballZ_dnf.py'
Nov 29 06:31:02 compute-1 sudo[132639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:03 compute-1 python3.9[132641]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:31:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:03.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:03 compute-1 ceph-mon[80754]: pgmap v558: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 6.2 KiB/s rd, 0 B/s wr, 10 op/s
Nov 29 06:31:04 compute-1 sudo[132639]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:04.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:05.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:05 compute-1 sudo[132792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcekydigxmgsoossqoxykfcounpzyvuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397864.9397333-356-81360398573097/AnsiballZ_systemd.py'
Nov 29 06:31:05 compute-1 sudo[132792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:05 compute-1 ceph-mon[80754]: pgmap v559: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 40 KiB/s rd, 0 B/s wr, 66 op/s
Nov 29 06:31:05 compute-1 python3.9[132794]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:31:05 compute-1 sudo[132792]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:06 compute-1 podman[132871]: 2025-11-29 06:31:06.426909742 +0000 UTC m=+0.142793105 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 06:31:06 compute-1 python3.9[132974]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:31:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:06.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:31:07 compute-1 sudo[133090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:07 compute-1 sudo[133090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:07 compute-1 sudo[133090]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:07 compute-1 sudo[133121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:31:07 compute-1 sudo[133121]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:07 compute-1 sudo[133121]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:07.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:07 compute-1 sudo[133146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:07 compute-1 sudo[133146]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:07 compute-1 sudo[133146]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:07 compute-1 python3.9[133101]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397866.1817796-380-140182130432464/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:07 compute-1 sudo[133171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:31:07 compute-1 sudo[133171]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:07 compute-1 sudo[133171]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:07 compute-1 python3.9[133366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:08 compute-1 ceph-mon[80754]: pgmap v560: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 40 KiB/s rd, 0 B/s wr, 66 op/s
Nov 29 06:31:08 compute-1 python3.9[133496]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397867.419116-380-254305290288700/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:08.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:09.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:09 compute-1 ceph-mon[80754]: pgmap v561: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 67 KiB/s rd, 0 B/s wr, 112 op/s
Nov 29 06:31:09 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:31:09 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:31:10 compute-1 python3.9[133646]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:10 compute-1 python3.9[133767]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397869.5160851-512-258296720298783/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:10.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:10 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 06:31:10 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:31:10 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:31:10 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:31:10 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:31:10 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:31:10 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:31:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:11.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:11 compute-1 python3.9[133917]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:12 compute-1 python3.9[134038]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397870.8335085-512-189465354089049/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:12 compute-1 ceph-mon[80754]: pgmap v562: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 75 KiB/s rd, 0 B/s wr, 124 op/s
Nov 29 06:31:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:31:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:12.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:31:12 compute-1 python3.9[134188]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:31:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:13.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:13 compute-1 sudo[134340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmeterlkobagviwpwmqxdfvwqfkfcwhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397873.091043-626-225266925173138/AnsiballZ_file.py'
Nov 29 06:31:13 compute-1 sudo[134340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:13 compute-1 python3.9[134342]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:13 compute-1 sudo[134340]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:14 compute-1 sudo[134492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejxcxhbvhjjmgrwoqcnubujybfwshucx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397873.9167194-650-275322969805982/AnsiballZ_stat.py'
Nov 29 06:31:14 compute-1 sudo[134492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:14 compute-1 python3.9[134494]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:14 compute-1 sudo[134492]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:14 compute-1 sudo[134570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekjukkoctngmtjquyqmkzxqmcsxnteco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397873.9167194-650-275322969805982/AnsiballZ_file.py'
Nov 29 06:31:14 compute-1 sudo[134570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:14.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:14 compute-1 python3.9[134572]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:14 compute-1 sudo[134570]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:15.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:15 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:15 compute-1 sudo[134722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgiwvqigemgzkytxultnsbstjyjztbyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397875.0788314-650-222989930750953/AnsiballZ_stat.py'
Nov 29 06:31:15 compute-1 sudo[134722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:15 compute-1 python3.9[134724]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:15 compute-1 sudo[134722]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:16 compute-1 sudo[134800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vulgkvrfjesxzhmdbwayrjuweyzqtwsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397875.0788314-650-222989930750953/AnsiballZ_file.py'
Nov 29 06:31:16 compute-1 sudo[134800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:16 compute-1 python3.9[134802]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:16 compute-1 sudo[134800]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:16.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:16 compute-1 sudo[134952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqydgdhucurzevshtjmijiszbpezbqnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397876.5568545-719-134590063667189/AnsiballZ_file.py'
Nov 29 06:31:16 compute-1 sudo[134952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:17 compute-1 python3.9[134954]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:17 compute-1 sudo[134952]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:17.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:17 compute-1 sudo[135104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iovigbbkkgibeoqizoceeujyrfgulwpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397877.4101925-743-94253630551563/AnsiballZ_stat.py'
Nov 29 06:31:17 compute-1 sudo[135104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:17 compute-1 python3.9[135106]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:17 compute-1 sudo[135104]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:18 compute-1 sudo[135182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikvesrcfjrldbplyqwnochfosmfymbfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397877.4101925-743-94253630551563/AnsiballZ_file.py'
Nov 29 06:31:18 compute-1 sudo[135182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:18 compute-1 python3.9[135184]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:18 compute-1 sudo[135182]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:18.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:19.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:19 compute-1 sudo[135334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhihddxhkqwnsscgsajkgzllrbdjfpjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397879.0690598-779-127714116044548/AnsiballZ_stat.py'
Nov 29 06:31:19 compute-1 sudo[135334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:19 compute-1 python3.9[135336]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:19 compute-1 sudo[135334]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:19 compute-1 sudo[135412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heooimnsepoqnycnshxrjjgqcpvmzkhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397879.0690598-779-127714116044548/AnsiballZ_file.py'
Nov 29 06:31:19 compute-1 sudo[135412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:20 compute-1 python3.9[135414]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:20 compute-1 sudo[135412]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:20 compute-1 ceph-mon[80754]: pgmap v563: 305 pgs: 305 active+clean; 456 KiB data, 149 MiB used, 21 GiB / 21 GiB avail; 86 KiB/s rd, 0 B/s wr, 142 op/s
Nov 29 06:31:20 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:20.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:20 compute-1 sudo[135564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wktydvlkkssvwjvjdjxsiqrwenkpvhmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397880.4510849-815-35759691474252/AnsiballZ_systemd.py'
Nov 29 06:31:20 compute-1 sudo[135564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:21.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:21 compute-1 python3.9[135566]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:31:21 compute-1 systemd[1]: Reloading.
Nov 29 06:31:21 compute-1 systemd-rc-local-generator[135592]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:31:21 compute-1 systemd-sysv-generator[135596]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:31:21 compute-1 ceph-mon[80754]: pgmap v564: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 88 KiB/s rd, 0 B/s wr, 146 op/s
Nov 29 06:31:21 compute-1 ceph-mon[80754]: pgmap v565: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 54 KiB/s rd, 0 B/s wr, 90 op/s
Nov 29 06:31:21 compute-1 ceph-mon[80754]: pgmap v566: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 54 KiB/s rd, 0 B/s wr, 90 op/s
Nov 29 06:31:21 compute-1 ceph-mon[80754]: pgmap v567: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 26 KiB/s rd, 0 B/s wr, 44 op/s
Nov 29 06:31:21 compute-1 sudo[135564]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:22 compute-1 sudo[135753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sywzsvnsilslkjvmxsvrdtubnjtxyjjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397881.8746898-839-114055753079389/AnsiballZ_stat.py'
Nov 29 06:31:22 compute-1 sudo[135753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:22 compute-1 python3.9[135755]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:22 compute-1 sudo[135753]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:22 compute-1 sudo[135831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpgjqwgjxofahkowpicwykpvfmjrhrgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397881.8746898-839-114055753079389/AnsiballZ_file.py'
Nov 29 06:31:22 compute-1 sudo[135831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:22.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:22 compute-1 python3.9[135833]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:22 compute-1 sudo[135831]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:23.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:23 compute-1 ceph-mon[80754]: pgmap v568: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 19 KiB/s rd, 0 B/s wr, 32 op/s
Nov 29 06:31:23 compute-1 sudo[135983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egqhnqyqffokuzusynfpurywujezgzls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397883.3680463-875-32799337540741/AnsiballZ_stat.py'
Nov 29 06:31:23 compute-1 sudo[135983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:23 compute-1 python3.9[135985]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:23 compute-1 sudo[135983]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:24 compute-1 sudo[136061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqmnntjjjnbvqekfeluwqvxnmipycsmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397883.3680463-875-32799337540741/AnsiballZ_file.py'
Nov 29 06:31:24 compute-1 sudo[136061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:24 compute-1 python3.9[136063]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:24 compute-1 sudo[136061]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:31:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:24.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:31:25 compute-1 sudo[136213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvdouxryutjssayfmaftdvuwwsoppmmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397884.7100022-911-204977229698263/AnsiballZ_systemd.py'
Nov 29 06:31:25 compute-1 sudo[136213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:25.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:25 compute-1 python3.9[136215]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:31:25 compute-1 systemd[1]: Reloading.
Nov 29 06:31:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:25 compute-1 systemd-rc-local-generator[136243]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:31:25 compute-1 systemd-sysv-generator[136247]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:31:25 compute-1 systemd[1]: Starting Create netns directory...
Nov 29 06:31:25 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:31:25 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:31:25 compute-1 systemd[1]: Finished Create netns directory.
Nov 29 06:31:25 compute-1 sudo[136213]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:26 compute-1 ceph-mon[80754]: pgmap v569: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 8.2 KiB/s rd, 0 B/s wr, 13 op/s
Nov 29 06:31:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:31:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:26.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:31:27 compute-1 sudo[136407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkvmvgvqdettoftdsekiyxcpkmwqlhtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397886.7226136-941-187049701108094/AnsiballZ_file.py'
Nov 29 06:31:27 compute-1 sudo[136407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:27 compute-1 ceph-mon[80754]: pgmap v570: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:27 compute-1 python3.9[136409]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:27.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:27 compute-1 sudo[136407]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:27 compute-1 sudo[136559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkpffgueklzwhgdmoppgrnekgxslriet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397887.4206333-965-64710958266798/AnsiballZ_stat.py'
Nov 29 06:31:27 compute-1 sudo[136559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:27 compute-1 python3.9[136561]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:27 compute-1 sudo[136559]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:28 compute-1 sudo[136682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isjbiqoaayqqfuxikcnynsrmzkmsvpzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397887.4206333-965-64710958266798/AnsiballZ_copy.py'
Nov 29 06:31:28 compute-1 sudo[136682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:28 compute-1 python3.9[136684]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397887.4206333-965-64710958266798/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:28 compute-1 sudo[136682]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:28.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:29.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:29 compute-1 sudo[136834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csacohgwfnbbdiqfxlnoyrcxvwdnwbka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397889.2149665-1016-82147930189464/AnsiballZ_file.py'
Nov 29 06:31:29 compute-1 sudo[136834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:29 compute-1 python3.9[136836]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:29 compute-1 sudo[136834]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:29 compute-1 ceph-mon[80754]: pgmap v571: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:30 compute-1 sudo[136986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mprdqbcctpwwmmbkwpebktimzdryxuvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397889.995605-1040-72687177317324/AnsiballZ_stat.py'
Nov 29 06:31:30 compute-1 sudo[136986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:30 compute-1 python3.9[136988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:30 compute-1 sudo[136986]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:30.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:30 compute-1 sudo[137109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxwydijrqxkdomktowuruurxirrciyaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397889.995605-1040-72687177317324/AnsiballZ_copy.py'
Nov 29 06:31:30 compute-1 sudo[137109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:31 compute-1 python3.9[137111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397889.995605-1040-72687177317324/.source.json _original_basename=._gnf33cx follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:31 compute-1 sudo[137109]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.203689) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891203890, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1376, "num_deletes": 252, "total_data_size": 3197835, "memory_usage": 3240768, "flush_reason": "Manual Compaction"}
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891224566, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 2087601, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9208, "largest_seqno": 10579, "table_properties": {"data_size": 2081808, "index_size": 3124, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12444, "raw_average_key_size": 19, "raw_value_size": 2069865, "raw_average_value_size": 3254, "num_data_blocks": 144, "num_entries": 636, "num_filter_entries": 636, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397737, "oldest_key_time": 1764397737, "file_creation_time": 1764397891, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 20919 microseconds, and 8458 cpu microseconds.
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.224628) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 2087601 bytes OK
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.224651) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.227122) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.227142) EVENT_LOG_v1 {"time_micros": 1764397891227135, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.227162) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 3191443, prev total WAL file size 3191443, number of live WAL files 2.
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.228320) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(2038KB)], [18(9227KB)]
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891228429, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 11536895, "oldest_snapshot_seqno": -1}
Nov 29 06:31:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:31.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4002 keys, 9547481 bytes, temperature: kUnknown
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891302200, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 9547481, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9515989, "index_size": 20374, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10053, "raw_key_size": 98620, "raw_average_key_size": 24, "raw_value_size": 9438770, "raw_average_value_size": 2358, "num_data_blocks": 889, "num_entries": 4002, "num_filter_entries": 4002, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764397891, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.302734) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 9547481 bytes
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.304408) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 155.7 rd, 128.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 9.0 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(10.1) write-amplify(4.6) OK, records in: 4522, records dropped: 520 output_compression: NoCompression
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.304431) EVENT_LOG_v1 {"time_micros": 1764397891304421, "job": 8, "event": "compaction_finished", "compaction_time_micros": 74110, "compaction_time_cpu_micros": 25635, "output_level": 6, "num_output_files": 1, "total_output_size": 9547481, "num_input_records": 4522, "num_output_records": 4002, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891305290, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891307341, "job": 8, "event": "table_file_deletion", "file_number": 18}
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.228101) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.307584) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.307594) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.307597) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.307600) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.307602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:31:31 compute-1 ceph-mon[80754]: pgmap v572: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:32 compute-1 sudo[137261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkadczexkfttgpuzgtpaxktvrtgfsszb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397891.9171689-1085-64153671637314/AnsiballZ_file.py'
Nov 29 06:31:32 compute-1 sudo[137261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:32 compute-1 python3.9[137263]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:32 compute-1 sudo[137261]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:32.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:33.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:33 compute-1 sudo[137413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjtqnjcmytfdfltkjacrtrxtmoupppuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397893.0575542-1109-163519045859029/AnsiballZ_stat.py'
Nov 29 06:31:33 compute-1 sudo[137413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:33 compute-1 sudo[137413]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:33 compute-1 sudo[137536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdtthlvivcfrieoemtvzqqqvtfgabesq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397893.0575542-1109-163519045859029/AnsiballZ_copy.py'
Nov 29 06:31:33 compute-1 sudo[137536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:34 compute-1 sudo[137536]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:34.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:35 compute-1 ceph-mon[80754]: pgmap v573: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:35 compute-1 sudo[137688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnxfvodbpudpvrwayeaelkrhmytpiaya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397894.6444488-1160-156324430443645/AnsiballZ_container_config_data.py'
Nov 29 06:31:35 compute-1 sudo[137688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:31:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:35.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:31:35 compute-1 python3.9[137690]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 29 06:31:35 compute-1 sudo[137688]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:35 compute-1 sshd-session[137691]: Received disconnect from 71.70.164.48 port 45162:11: Bye Bye [preauth]
Nov 29 06:31:35 compute-1 sshd-session[137691]: Disconnected from authenticating user root 71.70.164.48 port 45162 [preauth]
Nov 29 06:31:36 compute-1 sudo[137842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tietbjupuzedxdiehzfmnxcudwavbscw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397895.69024-1187-2648692373773/AnsiballZ_container_config_hash.py'
Nov 29 06:31:36 compute-1 sudo[137842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:36 compute-1 python3.9[137844]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:31:36 compute-1 sudo[137842]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:36.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:37.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:37 compute-1 podman[137869]: 2025-11-29 06:31:37.380468316 +0000 UTC m=+0.122874980 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 06:31:37 compute-1 ceph-mon[80754]: pgmap v574: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:37 compute-1 ceph-mon[80754]: pgmap v575: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:37 compute-1 sudo[138019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhzewnhzaekxnpmhaqxnxoblwnnmplnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397897.5324492-1214-82540766796723/AnsiballZ_podman_container_info.py'
Nov 29 06:31:37 compute-1 sudo[138019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:38 compute-1 python3.9[138021]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 06:31:38 compute-1 sudo[138019]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:38.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:31:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:39.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:31:39 compute-1 sudo[138198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avwzkaclugdgnttjvundmwrnksdzdmin ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397899.29751-1253-68951393117446/AnsiballZ_edpm_container_manage.py'
Nov 29 06:31:39 compute-1 sudo[138198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:40 compute-1 python3[138200]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:31:40 compute-1 ceph-mon[80754]: pgmap v576: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:31:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:40.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:31:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:41.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:41 compute-1 ceph-mon[80754]: pgmap v577: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:42.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:31:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:43.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:31:44 compute-1 ceph-mon[80754]: pgmap v578: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:44.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:45.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:45 compute-1 sudo[138278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:31:45 compute-1 sudo[138278]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:45 compute-1 sudo[138278]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:45 compute-1 sudo[138303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:31:45 compute-1 sudo[138303]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:31:45 compute-1 sudo[138303]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:46.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:31:46 compute-1 ceph-mon[80754]: pgmap v579: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:47.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:48.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:49.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:50.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:31:51 compute-1 ceph-mon[80754]: pgmap v580: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:31:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:51.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:31:51 compute-1 podman[138213]: 2025-11-29 06:31:51.561648508 +0000 UTC m=+11.292585137 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:31:51 compute-1 podman[138397]: 2025-11-29 06:31:51.729919259 +0000 UTC m=+0.052782612 container create b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:31:51 compute-1 podman[138397]: 2025-11-29 06:31:51.699180936 +0000 UTC m=+0.022044339 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:31:51 compute-1 python3[138200]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:31:51 compute-1 sudo[138198]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:52 compute-1 ceph-mon[80754]: pgmap v581: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:52 compute-1 ceph-mon[80754]: pgmap v582: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:52.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:53.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:53 compute-1 ceph-mon[80754]: pgmap v583: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:54.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:55.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:31:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:56.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:57.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:31:57 compute-1 ceph-mon[80754]: pgmap v584: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:31:58 compute-1 sshd-session[138461]: Received disconnect from 66.94.122.234 port 37306:11: Bye Bye [preauth]
Nov 29 06:31:58 compute-1 sshd-session[138461]: Disconnected from authenticating user root 66.94.122.234 port 37306 [preauth]
Nov 29 06:31:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:31:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:58.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:31:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:31:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:31:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:59.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:00 compute-1 ceph-mon[80754]: pgmap v585: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:32:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:00.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:32:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:01.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:02.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:03 compute-1 ceph-mon[80754]: pgmap v586: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:03 compute-1 ceph-mon[80754]: pgmap v587: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:03.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:04.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:05.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:05 compute-1 ceph-mon[80754]: pgmap v588: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:06.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:07.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:07 compute-1 ceph-mon[80754]: pgmap v589: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:08 compute-1 podman[138463]: 2025-11-29 06:32:08.404409203 +0000 UTC m=+0.138401523 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:32:08 compute-1 sudo[138614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpsmntfswmshfbzqumytowtoaxqkypoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397928.3763392-1277-178904532944121/AnsiballZ_stat.py'
Nov 29 06:32:08 compute-1 sudo[138614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:08 compute-1 ceph-mon[80754]: pgmap v590: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:08.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:08 compute-1 python3.9[138616]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:32:08 compute-1 sudo[138614]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:09.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:09 compute-1 sudo[138768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvzqcgagzbjxnjnvwtsgcridrmwnahyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397929.179295-1304-182542764344255/AnsiballZ_file.py'
Nov 29 06:32:09 compute-1 sudo[138768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:09 compute-1 python3.9[138770]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:09 compute-1 sudo[138768]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:09 compute-1 sudo[138844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzyencfkezdmjwfzawrdosmicuagaecu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397929.179295-1304-182542764344255/AnsiballZ_stat.py'
Nov 29 06:32:09 compute-1 sudo[138844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:10 compute-1 python3.9[138846]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:32:10 compute-1 sudo[138844]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:10 compute-1 ceph-mon[80754]: pgmap v591: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:10 compute-1 sudo[138995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kccymrphmrrftndwrpsxrjhjrigrrtfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397930.208002-1304-140102981237999/AnsiballZ_copy.py'
Nov 29 06:32:10 compute-1 sudo[138995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:10.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:10 compute-1 python3.9[138997]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397930.208002-1304-140102981237999/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:10 compute-1 sudo[138995]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:11 compute-1 sudo[139071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxhlzqrnpbmbwhlmliycbsekngknkwue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397930.208002-1304-140102981237999/AnsiballZ_systemd.py'
Nov 29 06:32:11 compute-1 sudo[139071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:11.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:11 compute-1 python3.9[139073]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:32:11 compute-1 systemd[1]: Reloading.
Nov 29 06:32:11 compute-1 systemd-rc-local-generator[139102]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:11 compute-1 systemd-sysv-generator[139105]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:11 compute-1 sudo[139071]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:12 compute-1 sudo[139184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmrnlrulkrmzqayckjwenykbhqsovyss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397930.208002-1304-140102981237999/AnsiballZ_systemd.py'
Nov 29 06:32:12 compute-1 sudo[139184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:12.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:12 compute-1 ceph-mon[80754]: pgmap v592: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:13 compute-1 python3.9[139186]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:13 compute-1 systemd[1]: Reloading.
Nov 29 06:32:13 compute-1 systemd-rc-local-generator[139216]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:13 compute-1 systemd-sysv-generator[139219]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:13 compute-1 sshd-session[139132]: Invalid user solana from 80.94.92.182 port 48768
Nov 29 06:32:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:32:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:13.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:32:13 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Nov 29 06:32:13 compute-1 sshd-session[139132]: Connection closed by invalid user solana 80.94.92.182 port 48768 [preauth]
Nov 29 06:32:13 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:32:13 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5980339fbd23606fd60482cabb2ffd0e0b84bee0e6bfb5159a1075dd20c3eed/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 29 06:32:13 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5980339fbd23606fd60482cabb2ffd0e0b84bee0e6bfb5159a1075dd20c3eed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:32:13 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2.
Nov 29 06:32:13 compute-1 podman[139226]: 2025-11-29 06:32:13.658706122 +0000 UTC m=+0.288601274 container init b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: + sudo -E kolla_set_configs
Nov 29 06:32:13 compute-1 podman[139226]: 2025-11-29 06:32:13.688980352 +0000 UTC m=+0.318875494 container start b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:32:13 compute-1 edpm-start-podman-container[139226]: ovn_metadata_agent
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Validating config file
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Copying service configuration files
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Writing out command to execute
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 29 06:32:13 compute-1 edpm-start-podman-container[139225]: Creating additional drop-in dependency for "ovn_metadata_agent" (b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2)
Nov 29 06:32:13 compute-1 podman[139248]: 2025-11-29 06:32:13.770974114 +0000 UTC m=+0.053844221 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: ++ cat /run_command
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: + CMD=neutron-ovn-metadata-agent
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: + ARGS=
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: + sudo kolla_copy_cacerts
Nov 29 06:32:13 compute-1 systemd[1]: Reloading.
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: Running command: 'neutron-ovn-metadata-agent'
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: + [[ ! -n '' ]]
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: + . kolla_extend_start
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: + umask 0022
Nov 29 06:32:13 compute-1 ovn_metadata_agent[139241]: + exec neutron-ovn-metadata-agent
Nov 29 06:32:13 compute-1 systemd-sysv-generator[139322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:13 compute-1 systemd-rc-local-generator[139317]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:14 compute-1 systemd[1]: Started ovn_metadata_agent container.
Nov 29 06:32:14 compute-1 sudo[139184]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:14 compute-1 ceph-mon[80754]: pgmap v593: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:14.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:15.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:15 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.849 139246 INFO neutron.common.config [-] Logging enabled!
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.849 139246 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.849 139246 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.860 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.860 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.860 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.860 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.860 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.861 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.861 139246 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.861 139246 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.861 139246 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.861 139246 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ceph-mon[80754]: pgmap v594: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.889 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.889 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.889 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.898 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.898 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.898 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.899 139246 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.899 139246 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.912 139246 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2fa83236-07b6-4ff7-bb56-9f4f13bed719 (UUID: 2fa83236-07b6-4ff7-bb56-9f4f13bed719) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.938 139246 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.938 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.938 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.938 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.941 139246 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.947 139246 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.953 139246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2fa83236-07b6-4ff7-bb56-9f4f13bed719'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f40cd05d6d0>], external_ids={}, name=2fa83236-07b6-4ff7-bb56-9f4f13bed719, nb_cfg_timestamp=1764397844667, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.955 139246 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f40cd056f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.956 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.956 139246 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.956 139246 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.956 139246 INFO oslo_service.service [-] Starting 1 workers
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.961 139246 DEBUG oslo_service.service [-] Started child 139354 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.965 139246 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpl53aly_m/privsep.sock']
Nov 29 06:32:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.966 139354 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-168415'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.001 139354 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.002 139354 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.002 139354 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.010 139354 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.018 139354 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.025 139354 INFO eventlet.wsgi.server [-] (139354) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 29 06:32:16 compute-1 sshd-session[130646]: Connection closed by 192.168.122.30 port 38708
Nov 29 06:32:16 compute-1 sshd-session[130643]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:32:16 compute-1 systemd[1]: session-47.scope: Deactivated successfully.
Nov 29 06:32:16 compute-1 systemd[1]: session-47.scope: Consumed 1min 847ms CPU time.
Nov 29 06:32:16 compute-1 systemd-logind[785]: Session 47 logged out. Waiting for processes to exit.
Nov 29 06:32:16 compute-1 systemd-logind[785]: Removed session 47.
Nov 29 06:32:16 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.742 139246 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.742 139246 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpl53aly_m/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.590 139359 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.595 139359 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.597 139359 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.597 139359 INFO oslo.privsep.daemon [-] privsep daemon running as pid 139359
Nov 29 06:32:16 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.745 139359 DEBUG oslo.privsep.daemon [-] privsep: reply[31ff0470-bc39-4c86-a9ef-d058219f2c8b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:32:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:16.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:16 compute-1 ceph-mon[80754]: pgmap v595: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.277 139359 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.277 139359 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.277 139359 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:32:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:17.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.846 139359 DEBUG oslo.privsep.daemon [-] privsep: reply[18f9256b-8ccd-4fdf-8dd6-478b787b6a3f]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.849 139246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2fa83236-07b6-4ff7-bb56-9f4f13bed719, column=external_ids, values=({'neutron:ovn-metadata-id': '0373c087-79f4-5325-b3bb-60a5df9a729a'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.859 139246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2fa83236-07b6-4ff7-bb56-9f4f13bed719, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.868 139246 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.868 139246 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.868 139246 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:17 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:32:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:18.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:19.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:20 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:20.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:21.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:22 compute-1 sshd-session[139364]: Accepted publickey for zuul from 192.168.122.30 port 45816 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:32:22 compute-1 systemd-logind[785]: New session 48 of user zuul.
Nov 29 06:32:22 compute-1 systemd[1]: Started Session 48 of User zuul.
Nov 29 06:32:22 compute-1 sshd-session[139364]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:32:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:22.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:23 compute-1 python3.9[139517]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:32:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:23.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:24 compute-1 sudo[139671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jukmvojwjcimyxoxzbazbttfqzebpxfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397944.054351-68-238417787647698/AnsiballZ_command.py'
Nov 29 06:32:24 compute-1 sudo[139671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:32:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:24.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:32:25 compute-1 python3.9[139673]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:32:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:25.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:25 compute-1 sudo[139671]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:25 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:32:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:32:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:26.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:32:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:32:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:27.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:32:27 compute-1 sudo[139835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqeuebtgnelrcbfahtktsnicfeurqaho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397946.9847538-101-203662153998489/AnsiballZ_systemd_service.py'
Nov 29 06:32:27 compute-1 sudo[139835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:27 compute-1 python3.9[139837]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:32:27 compute-1 systemd[1]: Reloading.
Nov 29 06:32:28 compute-1 systemd-rc-local-generator[139865]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:28 compute-1 systemd-sysv-generator[139868]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:28 compute-1 sudo[139835]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:28.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:29.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:29 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:32:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).paxos(paxos updating c 503..1046) lease_timeout -- calling new election
Nov 29 06:32:29 compute-1 ceph-mon[80754]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Nov 29 06:32:29 compute-1 ceph-mon[80754]: paxos.2).electionLogic(22) init, last seen epoch 22
Nov 29 06:32:29 compute-1 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:32:30 compute-1 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:32:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:30.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:32:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:31.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:32:32 compute-1 python3.9[140022]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:32:32 compute-1 network[140039]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:32:32 compute-1 network[140040]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:32:32 compute-1 network[140041]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:32:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:32.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:33.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:33 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:32:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:32:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:34.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:32:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:35.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:35 compute-1 ceph-mon[80754]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Nov 29 06:32:35 compute-1 ceph-mon[80754]: paxos.2).electionLogic(26) init, last seen epoch 26
Nov 29 06:32:35 compute-1 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:32:35 compute-1 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:32:36 compute-1 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:32:36 compute-1 ceph-mon[80754]: pgmap v596: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:36 compute-1 sudo[140301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzgllqqvpauqxhzdeilbowviaxikhxwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397956.4860334-158-104455865448651/AnsiballZ_systemd_service.py'
Nov 29 06:32:36 compute-1 sudo[140301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 06:32:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:36.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:37 compute-1 python3.9[140303]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:37 compute-1 sudo[140301]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:37.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:37 compute-1 sudo[140454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwlcbvvnnlfkzxusoseioitqpncoioer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397957.3404675-158-206076630161978/AnsiballZ_systemd_service.py'
Nov 29 06:32:37 compute-1 sudo[140454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:37 compute-1 python3.9[140456]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:38 compute-1 sudo[140454]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:38 compute-1 sudo[140623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwrqarqigsvnfxajdtfyxeamlzsnfqjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397958.2089226-158-109755441952870/AnsiballZ_systemd_service.py'
Nov 29 06:32:38 compute-1 sudo[140623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:38 compute-1 podman[140581]: 2025-11-29 06:32:38.631769419 +0000 UTC m=+0.125217645 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 06:32:38 compute-1 ceph-mon[80754]: pgmap v599: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-1 ceph-mon[80754]: pgmap v600: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-1 ceph-mon[80754]: pgmap v601: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-1 ceph-mon[80754]: mon.compute-1 calling monitor election
Nov 29 06:32:38 compute-1 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 06:32:38 compute-1 ceph-mon[80754]: pgmap v602: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-1 ceph-mon[80754]: pgmap v603: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-1 ceph-mon[80754]: pgmap v604: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-1 ceph-mon[80754]: mon.compute-2 is new leader, mons compute-2,compute-1 in quorum (ranks 1,2)
Nov 29 06:32:38 compute-1 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 06:32:38 compute-1 ceph-mon[80754]: overall HEALTH_OK
Nov 29 06:32:38 compute-1 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 06:32:38 compute-1 ceph-mon[80754]: mon.compute-1 calling monitor election
Nov 29 06:32:38 compute-1 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 06:32:38 compute-1 ceph-mon[80754]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 06:32:38 compute-1 ceph-mon[80754]: pgmap v605: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:38 compute-1 ceph-mon[80754]: monmap e3: 3 mons at {compute-0=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],compute-1=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],compute-2=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]} removed_ranks: {} disallowed_leaders: {}
Nov 29 06:32:38 compute-1 ceph-mon[80754]: fsmap cephfs:1 {0=cephfs.compute-2.gxdwyy=up:active} 2 up:standby
Nov 29 06:32:38 compute-1 ceph-mon[80754]: osdmap e139: 3 total, 3 up, 3 in
Nov 29 06:32:38 compute-1 ceph-mon[80754]: mgrmap e10: compute-0.vxabpq(active, since 15m), standbys: compute-2.ngsyhe, compute-1.gaxpay
Nov 29 06:32:38 compute-1 ceph-mon[80754]: overall HEALTH_OK
Nov 29 06:32:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:38.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:38 compute-1 python3.9[140629]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:38 compute-1 sudo[140623]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:39.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:39 compute-1 sudo[140785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jogtuaarjqstmqvcciscxbaclsdvcltw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397959.0823193-158-271383688165656/AnsiballZ_systemd_service.py'
Nov 29 06:32:39 compute-1 sudo[140785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:39 compute-1 python3.9[140787]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:39 compute-1 sudo[140785]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:40 compute-1 sudo[140938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvkbbylmioihtsvfqzxyfcbiwfsprofg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397959.8654532-158-21846904270294/AnsiballZ_systemd_service.py'
Nov 29 06:32:40 compute-1 sudo[140938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:40 compute-1 python3.9[140940]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:40 compute-1 sudo[140938]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:40.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:41 compute-1 sudo[141091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsvhyuulzztjellgsmrvldjnhksgbvqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397960.776036-158-239309936202692/AnsiballZ_systemd_service.py'
Nov 29 06:32:41 compute-1 sudo[141091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:41 compute-1 ceph-mon[80754]: pgmap v606: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:41.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:41 compute-1 python3.9[141093]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:42 compute-1 sudo[141091]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:42.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:42 compute-1 sudo[141244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imrnxlmelgomamkowdcnzkaugpfvboew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397962.6474686-158-46816823119706/AnsiballZ_systemd_service.py'
Nov 29 06:32:42 compute-1 sudo[141244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:43 compute-1 python3.9[141246]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:43.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:43 compute-1 sudo[141244]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:43 compute-1 ceph-mon[80754]: pgmap v607: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:44 compute-1 podman[141347]: 2025-11-29 06:32:44.359309977 +0000 UTC m=+0.081740782 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 06:32:44 compute-1 sudo[141416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlnjeculzkuffvctgbktvvgulnbujpqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397963.8562067-314-114046652199775/AnsiballZ_file.py'
Nov 29 06:32:44 compute-1 sudo[141416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:44 compute-1 python3.9[141418]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:44 compute-1 sudo[141416]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:44.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:45 compute-1 sudo[141569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dloojbokriorukqkjcmotsxwrlaldwbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397964.8018732-314-86337057438647/AnsiballZ_file.py'
Nov 29 06:32:45 compute-1 sudo[141569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:45.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:45 compute-1 python3.9[141571]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:45 compute-1 sudo[141569]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:45 compute-1 sudo[141721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeilfqtvwfshqbppmfkldvlsimweohph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397965.5619361-314-223651010492266/AnsiballZ_file.py'
Nov 29 06:32:45 compute-1 sudo[141721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:46 compute-1 sudo[141724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:32:46 compute-1 sudo[141724]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:46 compute-1 sudo[141724]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-1 python3.9[141723]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:46 compute-1 ceph-mon[80754]: pgmap v608: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:46 compute-1 sudo[141749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:32:46 compute-1 sudo[141749]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:46 compute-1 sudo[141749]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-1 sudo[141721]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-1 sudo[141774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:32:46 compute-1 sudo[141774]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:46 compute-1 sudo[141774]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-1 sudo[141824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:32:46 compute-1 sudo[141824]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:32:46 compute-1 sudo[141991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyhkfjlfijqjoxhzuvxgbioqoijufxiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397966.3379478-314-178104237809612/AnsiballZ_file.py'
Nov 29 06:32:46 compute-1 sudo[141991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:46 compute-1 sudo[141824]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:46.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:47.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:47 compute-1 python3.9[141993]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:47 compute-1 sudo[141991]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:48 compute-1 sudo[142157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fafwzfmotzdhzftoiznetayvaaqkrgug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397967.7563534-314-262931094851305/AnsiballZ_file.py'
Nov 29 06:32:48 compute-1 sudo[142157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:48 compute-1 python3.9[142159]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:48 compute-1 sudo[142157]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:48 compute-1 sudo[142309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osztbgxuuostkzbbyjzmmxqgrmeppqff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397968.4489598-314-18277865709695/AnsiballZ_file.py'
Nov 29 06:32:48 compute-1 sudo[142309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:48.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:49 compute-1 python3.9[142311]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:49 compute-1 sudo[142309]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:49.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:49 compute-1 sudo[142461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmmkyiciwtzyhpclzbfqjiuulexloteu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397969.2037697-314-135187497552381/AnsiballZ_file.py'
Nov 29 06:32:49 compute-1 sudo[142461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:49 compute-1 python3.9[142463]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:49 compute-1 sudo[142461]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:50.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:51.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:52 compute-1 sudo[142613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkyetlldvqvetuitovjxzgptzkbehitq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397972.0457942-464-105319807531592/AnsiballZ_file.py'
Nov 29 06:32:52 compute-1 sudo[142613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:52 compute-1 python3.9[142615]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:52 compute-1 sudo[142613]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:32:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:52.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:32:53 compute-1 sudo[142765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewccdlqlfojpuykknywhqtgayepjkdbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397972.7777722-464-131283123392106/AnsiballZ_file.py'
Nov 29 06:32:53 compute-1 sudo[142765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:53 compute-1 python3.9[142767]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:53 compute-1 sudo[142765]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:53.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:53 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:53 compute-1 sudo[142917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhgbmshsyqrnmqsoitdcouhwmxjxcphp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397973.487426-464-155220495884534/AnsiballZ_file.py'
Nov 29 06:32:53 compute-1 sudo[142917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:53 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:32:53 compute-1 python3.9[142919]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:54 compute-1 sudo[142917]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:54 compute-1 sudo[143069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sljrtktuskxuoebuzyyizykgzqvopiyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397974.1487572-464-13552417952327/AnsiballZ_file.py'
Nov 29 06:32:54 compute-1 sudo[143069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:54 compute-1 python3.9[143071]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:54 compute-1 sudo[143069]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:54.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:55 compute-1 sudo[143221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyitxzbdwygxitwqepavsvoqixraicat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397974.8698707-464-45972265003458/AnsiballZ_file.py'
Nov 29 06:32:55 compute-1 sudo[143221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:55.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:55 compute-1 python3.9[143223]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:55 compute-1 sudo[143221]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:55 compute-1 ceph-mon[80754]: pgmap v609: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:55 compute-1 sudo[143373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onjzftnsecditsdpjcydopmmccetodhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397975.6743407-464-185621334992288/AnsiballZ_file.py'
Nov 29 06:32:55 compute-1 sudo[143373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:56 compute-1 python3.9[143375]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:56 compute-1 sudo[143373]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:56 compute-1 sudo[143525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrpjvdkwboimutxsyenvnhzmdwoiwofj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397976.3690665-464-53454690173711/AnsiballZ_file.py'
Nov 29 06:32:56 compute-1 sudo[143525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:56 compute-1 python3.9[143527]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:56 compute-1 sudo[143525]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:56.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:32:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:57.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:32:57 compute-1 ceph-mon[80754]: pgmap v610: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:57 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:32:57 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:32:57 compute-1 ceph-mon[80754]: pgmap v611: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:57 compute-1 ceph-mon[80754]: pgmap v612: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:57 compute-1 ceph-mon[80754]: pgmap v613: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:57 compute-1 ceph-mon[80754]: pgmap v614: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:57 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:32:57 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:32:57 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:32:57 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:32:57 compute-1 ceph-mon[80754]: pgmap v615: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:32:58 compute-1 sudo[143677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqcjcvuhcfgtgmddzaqarfllomdaiktg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397978.2133174-617-64028371240035/AnsiballZ_command.py'
Nov 29 06:32:58 compute-1 sudo[143677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:58 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:32:58 compute-1 python3.9[143679]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:32:58 compute-1 sudo[143677]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:32:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:58.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:32:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:32:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:32:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:59.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:00 compute-1 python3.9[143831]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:33:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:00.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:01 compute-1 sudo[143981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvldyjsonllsajayslsyiirdooxycpkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397980.7668798-671-98065971910180/AnsiballZ_systemd_service.py'
Nov 29 06:33:01 compute-1 sudo[143981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:01 compute-1 python3.9[143983]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:33:01 compute-1 systemd[1]: Reloading.
Nov 29 06:33:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:01.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:01 compute-1 systemd-sysv-generator[144013]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:33:01 compute-1 systemd-rc-local-generator[144007]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:33:01 compute-1 ceph-mon[80754]: pgmap v616: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:02 compute-1 sudo[143981]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:02 compute-1 sudo[144168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifridlsbhcsfgoctucmvhsxdkysvmhgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397982.370808-695-42110475237237/AnsiballZ_command.py'
Nov 29 06:33:02 compute-1 sudo[144168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:02 compute-1 python3.9[144170]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:02 compute-1 sudo[144168]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:02.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:03 compute-1 sudo[144321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsxguhvkvukibzfirivqbmsndtuatlun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397983.0462074-695-206577537409806/AnsiballZ_command.py'
Nov 29 06:33:03 compute-1 sudo[144321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:03.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:03 compute-1 python3.9[144323]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:03 compute-1 sudo[144321]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:04 compute-1 sudo[144474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddbojqfaeoseolkulbdmlsfmuiydpsze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397983.7238002-695-206721067875393/AnsiballZ_command.py'
Nov 29 06:33:04 compute-1 sudo[144474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:04 compute-1 ceph-mon[80754]: pgmap v617: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:04 compute-1 ceph-mon[80754]: pgmap v618: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:04 compute-1 python3.9[144476]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:04 compute-1 sudo[144474]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:04 compute-1 sudo[144627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qunsbacpykmimxjbykeaaiqkdbhdmxxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397984.66756-695-191060157504558/AnsiballZ_command.py'
Nov 29 06:33:04 compute-1 sudo[144627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:04.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:05 compute-1 python3.9[144629]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:05 compute-1 sudo[144627]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:05.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:05 compute-1 sudo[144780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owynnzbtgidzzspycyggwqnjrteueeib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397985.3343692-695-268639209670779/AnsiballZ_command.py'
Nov 29 06:33:05 compute-1 sudo[144780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:05 compute-1 python3.9[144782]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:05 compute-1 sudo[144780]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:06 compute-1 sudo[144933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcxeavttbptqplwydhlusoslwgpudqfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397985.9777045-695-80748845099485/AnsiballZ_command.py'
Nov 29 06:33:06 compute-1 sudo[144933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:06 compute-1 python3.9[144935]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:06 compute-1 sudo[144933]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:06 compute-1 ceph-mon[80754]: pgmap v619: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:06.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:06 compute-1 sudo[145086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lshoecbpiflidbxqmxymvejvnkznllms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397986.676885-695-272526837031164/AnsiballZ_command.py'
Nov 29 06:33:06 compute-1 sudo[145086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:07 compute-1 python3.9[145088]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:07 compute-1 sudo[145086]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:07.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:08 compute-1 ceph-mon[80754]: pgmap v620: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:08.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:09 compute-1 sudo[145248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgnhwloxvojwjmtuggfdzxgivaxtound ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397988.804582-857-255765677388023/AnsiballZ_getent.py'
Nov 29 06:33:09 compute-1 sudo[145248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:09 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:09 compute-1 podman[145213]: 2025-11-29 06:33:09.372479916 +0000 UTC m=+0.135485413 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 29 06:33:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:09.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:09 compute-1 python3.9[145259]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 29 06:33:09 compute-1 sudo[145248]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:10 compute-1 ceph-mon[80754]: pgmap v621: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:10 compute-1 sudo[145418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqmqtqhsfzjiluabratdlzdigogmupsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397989.7479947-881-265200685324832/AnsiballZ_group.py'
Nov 29 06:33:10 compute-1 sudo[145418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:10 compute-1 python3.9[145420]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:33:10 compute-1 groupadd[145421]: group added to /etc/group: name=libvirt, GID=42473
Nov 29 06:33:10 compute-1 groupadd[145421]: group added to /etc/gshadow: name=libvirt
Nov 29 06:33:10 compute-1 groupadd[145421]: new group: name=libvirt, GID=42473
Nov 29 06:33:10 compute-1 sudo[145418]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:10.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:10 compute-1 sudo[145451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:33:10 compute-1 sudo[145451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:33:10 compute-1 sudo[145451]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:11 compute-1 sudo[145476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:33:11 compute-1 sudo[145476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:33:11 compute-1 sudo[145476]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:33:11 compute-1 ceph-mon[80754]: pgmap v622: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:33:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:11.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:11 compute-1 sudo[145626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrtnmqoqmuwywqmtqprbdwzraiuwbcla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397991.079843-905-101523309380626/AnsiballZ_user.py'
Nov 29 06:33:11 compute-1 sudo[145626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:11 compute-1 python3.9[145628]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:33:11 compute-1 useradd[145630]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 29 06:33:11 compute-1 sudo[145626]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:12.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:13.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:13 compute-1 ceph-mon[80754]: pgmap v623: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:14.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:15 compute-1 podman[145661]: 2025-11-29 06:33:15.306411434 +0000 UTC m=+0.054754439 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 06:33:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:15.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:15 compute-1 ceph-mon[80754]: pgmap v624: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:33:15.892 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:33:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:33:15.893 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:33:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:33:15.894 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:33:16 compute-1 sudo[145806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vurjtnugrbtkjtsglptujmlinykaakoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397995.7612555-938-198678427817728/AnsiballZ_setup.py'
Nov 29 06:33:16 compute-1 sudo[145806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:16 compute-1 python3.9[145808]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:33:16 compute-1 sudo[145806]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:16.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:16 compute-1 sudo[145890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwcgeqrtyucaugcncibbdylqytckxoro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397995.7612555-938-198678427817728/AnsiballZ_dnf.py'
Nov 29 06:33:16 compute-1 sudo[145890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:17 compute-1 python3.9[145892]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:33:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:17.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:18 compute-1 ceph-mon[80754]: pgmap v625: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:18 compute-1 sshd-session[145894]: Received disconnect from 93.157.248.178 port 33604:11: Bye Bye [preauth]
Nov 29 06:33:18 compute-1 sshd-session[145894]: Disconnected from authenticating user root 93.157.248.178 port 33604 [preauth]
Nov 29 06:33:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:18.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:19 compute-1 ceph-mon[80754]: pgmap v626: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:19 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:19.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:20.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:21 compute-1 ceph-mon[80754]: pgmap v627: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:21.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:22.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:23.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:24 compute-1 ceph-mon[80754]: pgmap v628: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:24 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:24.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:25.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:25 compute-1 ceph-mon[80754]: pgmap v629: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:26.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:27 compute-1 ceph-mon[80754]: pgmap v630: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:27.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:28.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:29.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:29 compute-1 ceph-mon[80754]: pgmap v631: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:30.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:31.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:31 compute-1 ceph-mon[80754]: pgmap v632: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:32.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:33.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:34 compute-1 ceph-mon[80754]: pgmap v633: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:34 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:34.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:35 compute-1 ceph-mon[80754]: pgmap v634: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:35.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:36.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:37.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:37 compute-1 ceph-mon[80754]: pgmap v635: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:38 compute-1 ceph-mon[80754]: pgmap v636: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:38.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:39 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:39.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:40 compute-1 podman[146086]: 2025-11-29 06:33:40.444473231 +0000 UTC m=+0.153030676 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:33:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:41.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:41.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:42 compute-1 ceph-mon[80754]: pgmap v637: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:43.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:43 compute-1 ceph-mon[80754]: pgmap v638: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:43.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:44 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:44 compute-1 sshd-session[146113]: Invalid user desliga from 66.94.122.234 port 36158
Nov 29 06:33:44 compute-1 sshd-session[146113]: Received disconnect from 66.94.122.234 port 36158:11: Bye Bye [preauth]
Nov 29 06:33:44 compute-1 sshd-session[146113]: Disconnected from invalid user desliga 66.94.122.234 port 36158 [preauth]
Nov 29 06:33:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:45.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:45.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:46 compute-1 ceph-mon[80754]: pgmap v639: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:46 compute-1 podman[146115]: 2025-11-29 06:33:46.378374293 +0000 UTC m=+0.108509279 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:33:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:47.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:47.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:47 compute-1 ceph-mon[80754]: pgmap v640: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000052s ======
Nov 29 06:33:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:49.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Nov 29 06:33:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:49.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:49 compute-1 ceph-mon[80754]: pgmap v641: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:49 compute-1 sshd-session[146085]: error: kex_exchange_identification: read: Connection timed out
Nov 29 06:33:49 compute-1 sshd-session[146085]: banner exchange: Connection from 119.45.242.7 port 38092: Connection timed out
Nov 29 06:33:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:51.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:51 compute-1 ceph-mon[80754]: pgmap v642: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:51 compute-1 kernel: SELinux:  Converting 2769 SID table entries...
Nov 29 06:33:51 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:33:51 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:33:51 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:33:51 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:33:51 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:33:51 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:33:51 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:33:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:33:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:51.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:33:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:53.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:53.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:53 compute-1 ceph-mon[80754]: pgmap v643: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:55.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:55.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:55 compute-1 ceph-mon[80754]: pgmap v644: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:56 compute-1 sshd-session[146142]: Invalid user rstudio from 71.70.164.48 port 44610
Nov 29 06:33:56 compute-1 sshd-session[146142]: Received disconnect from 71.70.164.48 port 44610:11: Bye Bye [preauth]
Nov 29 06:33:56 compute-1 sshd-session[146142]: Disconnected from invalid user rstudio 71.70.164.48 port 44610 [preauth]
Nov 29 06:33:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:57.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:57.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:33:58 compute-1 ceph-mon[80754]: pgmap v645: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:33:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:59.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:33:59 compute-1 ceph-mon[80754]: pgmap v646: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:33:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:33:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:33:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:33:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:59.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:01.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:01.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:01 compute-1 ceph-mon[80754]: pgmap v647: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:01 compute-1 kernel: SELinux:  Converting 2769 SID table entries...
Nov 29 06:34:01 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:34:01 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:34:01 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:34:01 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:34:01 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:34:01 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:34:01 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:34:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:03.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:03 compute-1 ceph-mon[80754]: pgmap v648: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:03.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:05.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:05.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:05 compute-1 ceph-mon[80754]: pgmap v649: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:07.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:34:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:07.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:34:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:09.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:09 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:09 compute-1 ceph-mon[80754]: pgmap v650: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:09.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:10 compute-1 ceph-mon[80754]: pgmap v651: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:10 compute-1 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 29 06:34:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:11.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:11 compute-1 podman[146151]: 2025-11-29 06:34:11.114541316 +0000 UTC m=+0.101668396 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:34:11 compute-1 sudo[146177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:11 compute-1 sudo[146177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:11 compute-1 sudo[146177]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:11 compute-1 sudo[146202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:34:11 compute-1 sudo[146202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:11 compute-1 sudo[146202]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:11 compute-1 sudo[146227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:11 compute-1 sudo[146227]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:11 compute-1 sudo[146227]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:11 compute-1 sudo[146252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:34:11 compute-1 sudo[146252]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:11.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:11 compute-1 sudo[146252]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:12 compute-1 ceph-mon[80754]: pgmap v652: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:13.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:34:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:34:13 compute-1 ceph-mon[80754]: pgmap v653: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:34:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:34:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:34:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:34:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:13.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:15.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:15.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:34:15.894 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:34:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:34:15.895 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:34:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:34:15.896 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:34:16 compute-1 ceph-mon[80754]: pgmap v654: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:17.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:17 compute-1 podman[147736]: 2025-11-29 06:34:17.325650732 +0000 UTC m=+0.062002758 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 29 06:34:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:17.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:18 compute-1 ceph-mon[80754]: pgmap v655: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:19.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:19 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:19.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:20 compute-1 ceph-mon[80754]: pgmap v656: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:21.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:21 compute-1 ceph-mon[80754]: pgmap v657: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:21.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:23.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:23.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:24 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:25.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:25.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:27.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:27.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:28 compute-1 ceph-mon[80754]: pgmap v658: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:29.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:29 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:29.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:30 compute-1 ceph-mon[80754]: pgmap v659: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:30 compute-1 ceph-mon[80754]: pgmap v660: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:30 compute-1 ceph-mon[80754]: pgmap v661: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:31.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:31.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:33.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:33.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:34 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:34 compute-1 ceph-mon[80754]: pgmap v662: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:35.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:35.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:37.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:37 compute-1 ceph-mon[80754]: pgmap v663: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:37 compute-1 ceph-mon[80754]: pgmap v664: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:37.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:39.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:39 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:39.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:40 compute-1 ceph-mon[80754]: pgmap v665: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:41.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:41 compute-1 podman[161643]: 2025-11-29 06:34:41.333141989 +0000 UTC m=+0.073070027 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 29 06:34:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:41.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:42 compute-1 ceph-mon[80754]: pgmap v666: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:42 compute-1 sudo[162831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:34:42 compute-1 sudo[162831]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:42 compute-1 sudo[162831]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:42 compute-1 sudo[162900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:34:42 compute-1 sudo[162900]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:34:42 compute-1 sudo[162900]: pam_unix(sudo:session): session closed for user root
Nov 29 06:34:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:43.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:43.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:44 compute-1 ceph-mon[80754]: pgmap v667: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:44 compute-1 ceph-mon[80754]: pgmap v668: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:44 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:34:44 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:34:44 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:45.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:45.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:47.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:47.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:47 compute-1 podman[163192]: 2025-11-29 06:34:47.609924941 +0000 UTC m=+0.051519056 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:34:48 compute-1 ceph-mon[80754]: pgmap v669: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:49.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:49 compute-1 ceph-mon[80754]: pgmap v670: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:49 compute-1 ceph-mon[80754]: pgmap v671: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:49.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:51.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:51 compute-1 ceph-mon[80754]: pgmap v672: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:51.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:53.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:53.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:53 compute-1 ceph-mon[80754]: pgmap v673: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:34:55 compute-1 ceph-mon[80754]: pgmap v674: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:55.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:55.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:55 compute-1 ceph-mgr[81116]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 06:34:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:57.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:57.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:58 compute-1 ceph-mon[80754]: pgmap v675: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:34:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:34:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:59.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:34:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:34:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:34:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:59.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:34:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:01.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:01.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:02 compute-1 kernel: SELinux:  Converting 2770 SID table entries...
Nov 29 06:35:02 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:35:02 compute-1 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:35:02 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:35:02 compute-1 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:35:02 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:35:02 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:35:02 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:35:02 compute-1 ceph-mon[80754]: pgmap v676: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:03.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:03 compute-1 ceph-mon[80754]: pgmap v677: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:03 compute-1 ceph-mon[80754]: pgmap v678: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:03.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:05.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:05.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:07.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:07.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:08 compute-1 ceph-mon[80754]: pgmap v679: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:08 compute-1 sshd-session[163229]: Invalid user free from 93.157.248.178 port 56104
Nov 29 06:35:08 compute-1 sshd-session[163229]: Received disconnect from 93.157.248.178 port 56104:11: Bye Bye [preauth]
Nov 29 06:35:08 compute-1 sshd-session[163229]: Disconnected from invalid user free 93.157.248.178 port 56104 [preauth]
Nov 29 06:35:08 compute-1 groupadd[163233]: group added to /etc/group: name=dnsmasq, GID=992
Nov 29 06:35:08 compute-1 groupadd[163233]: group added to /etc/gshadow: name=dnsmasq
Nov 29 06:35:08 compute-1 groupadd[163233]: new group: name=dnsmasq, GID=992
Nov 29 06:35:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:09.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:09 compute-1 useradd[163241]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 29 06:35:09 compute-1 ceph-mon[80754]: pgmap v680: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:09.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:09 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:10 compute-1 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 06:35:10 compute-1 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 29 06:35:10 compute-1 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 06:35:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:11.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:11.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:11 compute-1 ceph-mon[80754]: pgmap v681: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:12 compute-1 podman[163251]: 2025-11-29 06:35:12.380265062 +0000 UTC m=+0.107428720 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:35:13 compute-1 ceph-mon[80754]: pgmap v682: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:13.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:13 compute-1 groupadd[163281]: group added to /etc/group: name=clevis, GID=991
Nov 29 06:35:13 compute-1 groupadd[163281]: group added to /etc/gshadow: name=clevis
Nov 29 06:35:13 compute-1 groupadd[163281]: new group: name=clevis, GID=991
Nov 29 06:35:13 compute-1 useradd[163288]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 29 06:35:13 compute-1 usermod[163298]: add 'clevis' to group 'tss'
Nov 29 06:35:13 compute-1 usermod[163298]: add 'clevis' to shadow group 'tss'
Nov 29 06:35:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:13.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:14 compute-1 ceph-mon[80754]: pgmap v683: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:15.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:15 compute-1 ceph-mon[80754]: pgmap v684: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:15.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:15 compute-1 polkitd[43499]: Reloading rules
Nov 29 06:35:15 compute-1 polkitd[43499]: Collecting garbage unconditionally...
Nov 29 06:35:15 compute-1 polkitd[43499]: Loading rules from directory /etc/polkit-1/rules.d
Nov 29 06:35:15 compute-1 polkitd[43499]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 29 06:35:15 compute-1 polkitd[43499]: Finished loading, compiling and executing 3 rules
Nov 29 06:35:15 compute-1 polkitd[43499]: Reloading rules
Nov 29 06:35:15 compute-1 polkitd[43499]: Collecting garbage unconditionally...
Nov 29 06:35:15 compute-1 polkitd[43499]: Loading rules from directory /etc/polkit-1/rules.d
Nov 29 06:35:15 compute-1 polkitd[43499]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 29 06:35:15 compute-1 polkitd[43499]: Finished loading, compiling and executing 3 rules
Nov 29 06:35:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:35:15.900 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:35:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:35:15.902 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:35:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:35:15.902 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:35:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000052s ======
Nov 29 06:35:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:17.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Nov 29 06:35:17 compute-1 groupadd[163485]: group added to /etc/group: name=ceph, GID=167
Nov 29 06:35:17 compute-1 groupadd[163485]: group added to /etc/gshadow: name=ceph
Nov 29 06:35:17 compute-1 groupadd[163485]: new group: name=ceph, GID=167
Nov 29 06:35:17 compute-1 useradd[163491]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 29 06:35:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:17.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:18 compute-1 podman[163498]: 2025-11-29 06:35:18.330138183 +0000 UTC m=+0.067244920 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 06:35:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:19.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:19.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:19 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:20 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Nov 29 06:35:20 compute-1 sshd[1007]: Received signal 15; terminating.
Nov 29 06:35:20 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Nov 29 06:35:20 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Nov 29 06:35:20 compute-1 systemd[1]: sshd.service: Consumed 6.695s CPU time, read 32.0K from disk, written 184.0K to disk.
Nov 29 06:35:20 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Nov 29 06:35:20 compute-1 systemd[1]: Stopping sshd-keygen.target...
Nov 29 06:35:20 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 06:35:20 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 06:35:20 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 06:35:20 compute-1 systemd[1]: Reached target sshd-keygen.target.
Nov 29 06:35:20 compute-1 systemd[1]: Starting OpenSSH server daemon...
Nov 29 06:35:20 compute-1 sshd[164115]: Server listening on 0.0.0.0 port 22.
Nov 29 06:35:20 compute-1 sshd[164115]: Server listening on :: port 22.
Nov 29 06:35:20 compute-1 systemd[1]: Started OpenSSH server daemon.
Nov 29 06:35:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:21.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:21.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:21 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:35:22 compute-1 ceph-mon[80754]: pgmap v685: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:22 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:35:22 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:35:23 compute-1 systemd[1]: Reloading.
Nov 29 06:35:23 compute-1 systemd-rc-local-generator[164373]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:35:23 compute-1 systemd-sysv-generator[164377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:35:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:23.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:23 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:35:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:23.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:25 compute-1 ceph-mon[80754]: pgmap v686: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:25 compute-1 ceph-mon[80754]: pgmap v687: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:25 compute-1 ceph-mon[80754]: pgmap v688: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:25.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:25.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:25 compute-1 sudo[145890]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:26 compute-1 ceph-mon[80754]: pgmap v689: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:35:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:27.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:35:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:27.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:27 compute-1 ceph-mon[80754]: pgmap v690: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:29.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:29.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:30 compute-1 ceph-mon[80754]: pgmap v691: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:31 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:35:31 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:35:31 compute-1 systemd[1]: man-db-cache-update.service: Consumed 10.043s CPU time.
Nov 29 06:35:31 compute-1 systemd[1]: run-r802f4fe76eb0417b87be99fe7d8cb287.service: Deactivated successfully.
Nov 29 06:35:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:31.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:31 compute-1 ceph-mon[80754]: pgmap v692: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:31.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:33 compute-1 sshd-session[172795]: Invalid user solv from 80.94.92.182 port 51324
Nov 29 06:35:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:33.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:33 compute-1 sshd-session[172795]: Connection closed by invalid user solv 80.94.92.182 port 51324 [preauth]
Nov 29 06:35:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:33.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:34 compute-1 ceph-mon[80754]: pgmap v693: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:35.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:35.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:35 compute-1 ceph-mon[80754]: pgmap v694: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:37.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:37.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:38 compute-1 ceph-mon[80754]: pgmap v695: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:39.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:39.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:41.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:41.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:43.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:43 compute-1 sudo[172797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:35:43 compute-1 sudo[172797]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:43 compute-1 sudo[172797]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:43 compute-1 sudo[172828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:35:43 compute-1 sudo[172828]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:43 compute-1 sudo[172828]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:43 compute-1 podman[172821]: 2025-11-29 06:35:43.381743552 +0000 UTC m=+0.132987968 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:35:43 compute-1 sudo[172868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:35:43 compute-1 sudo[172868]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:43 compute-1 sudo[172868]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:43 compute-1 sudo[172899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:35:43 compute-1 sudo[172899]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:35:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:43.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:43 compute-1 ceph-mon[80754]: pgmap v696: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:44 compute-1 sudo[172899]: pam_unix(sudo:session): session closed for user root
Nov 29 06:35:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:45.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:45.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:47.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:47.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:48 compute-1 ceph-mon[80754]: pgmap v697: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:48 compute-1 ceph-mon[80754]: pgmap v698: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:35:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:49.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:49 compute-1 podman[172955]: 2025-11-29 06:35:49.337624214 +0000 UTC m=+0.072144309 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 06:35:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:49.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:51.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:51.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:51 compute-1 ceph-mon[80754]: pgmap v699: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:35:51 compute-1 ceph-mon[80754]: pgmap v700: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:51 compute-1 ceph-mon[80754]: pgmap v701: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:51 compute-1 ceph-mon[80754]: pgmap v702: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:35:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:35:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:53.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:53 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:35:53 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:35:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:53.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:35:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:55.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:55.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:56 compute-1 ceph-mon[80754]: pgmap v703: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:35:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:57.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:35:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:57.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:35:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:59.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:35:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:35:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:35:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:59.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:01.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:01.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:01 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:36:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:03.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:03.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:04 compute-1 ceph-mon[80754]: pgmap v704: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:04 compute-1 ceph-mon[80754]: pgmap v705: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:04 compute-1 ceph-mon[80754]: pgmap v706: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:04 compute-1 ceph-mon[80754]: pgmap v707: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:05.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:05.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:07.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:07.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:09.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:09.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:11.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:11 compute-1 ceph-mon[80754]: pgmap v708: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:11 compute-1 ceph-mon[80754]: pgmap v709: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:11.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:36:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:13.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:36:13 compute-1 ceph-mon[80754]: pgmap v710: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:13 compute-1 ceph-mon[80754]: pgmap v711: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:13 compute-1 ceph-mon[80754]: pgmap v712: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:13.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:14 compute-1 podman[172975]: 2025-11-29 06:36:14.392881587 +0000 UTC m=+0.121730016 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 29 06:36:15 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:15.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:15 compute-1 ceph-mon[80754]: pgmap v713: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:15.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:36:15.902 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:36:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:36:15.903 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:36:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:36:15.903 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:36:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:17.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:17 compute-1 sshd-session[173001]: Invalid user terraria from 71.70.164.48 port 43971
Nov 29 06:36:17 compute-1 sshd-session[173001]: Received disconnect from 71.70.164.48 port 43971:11: Bye Bye [preauth]
Nov 29 06:36:17 compute-1 sshd-session[173001]: Disconnected from invalid user terraria 71.70.164.48 port 43971 [preauth]
Nov 29 06:36:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:17.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:18 compute-1 ceph-mon[80754]: pgmap v714: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:19.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:19.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:20 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:20 compute-1 ceph-mon[80754]: pgmap v715: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:20 compute-1 ceph-mon[80754]: pgmap v716: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:20 compute-1 podman[173005]: 2025-11-29 06:36:20.335166107 +0000 UTC m=+0.072062158 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 06:36:20 compute-1 sshd-session[173003]: Invalid user odoo15 from 93.157.248.178 port 36540
Nov 29 06:36:20 compute-1 sshd-session[173003]: Received disconnect from 93.157.248.178 port 36540:11: Bye Bye [preauth]
Nov 29 06:36:20 compute-1 sshd-session[173003]: Disconnected from invalid user odoo15 93.157.248.178 port 36540 [preauth]
Nov 29 06:36:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:21.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:21.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:23.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:23.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:25.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.004000104s ======
Nov 29 06:36:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:25.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000104s
Nov 29 06:36:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:27.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:27.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:29.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:29 compute-1 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 06:36:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:29.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:30 compute-1 ceph-mon[80754]: pgmap v717: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:31.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:31.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:33 compute-1 ceph-mon[80754]: pgmap v718: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:33 compute-1 ceph-mon[80754]: pgmap v719: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:33 compute-1 ceph-mon[80754]: pgmap v720: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:33 compute-1 ceph-mon[80754]: pgmap v721: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:33 compute-1 ceph-mon[80754]: pgmap v722: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:33.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:33.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:34 compute-1 ceph-mon[80754]: pgmap v723: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:35.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:35.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:37 compute-1 ceph-mon[80754]: pgmap v724: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:37.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:37.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:39.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:39.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:40 compute-1 ceph-mon[80754]: pgmap v725: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:40 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:36:40 compute-1 sudo[173150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqwfakczsytldptlpactaiaftyklakzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398200.2830312-974-141003346012002/AnsiballZ_systemd.py'
Nov 29 06:36:40 compute-1 sudo[173150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:40 compute-1 sudo[173153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:36:40 compute-1 sudo[173153]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:36:40 compute-1 sudo[173153]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:41 compute-1 sudo[173178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:36:41 compute-1 sudo[173178]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:36:41 compute-1 sudo[173178]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:41 compute-1 python3.9[173152]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:41 compute-1 systemd[1]: Reloading.
Nov 29 06:36:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:41.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:41 compute-1 systemd-sysv-generator[173236]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:41 compute-1 systemd-rc-local-generator[173231]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:41 compute-1 sudo[173150]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:41.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:41 compute-1 ceph-mon[80754]: pgmap v726: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:41 compute-1 ceph-mon[80754]: pgmap v727: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:41 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:36:42 compute-1 sudo[173390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brbgsgszfbeybfplhvalzozfvmiwdfyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398201.7144723-974-79735775628837/AnsiballZ_systemd.py'
Nov 29 06:36:42 compute-1 sudo[173390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:42 compute-1 python3.9[173392]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:42 compute-1 systemd[1]: Reloading.
Nov 29 06:36:42 compute-1 systemd-sysv-generator[173425]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:42 compute-1 systemd-rc-local-generator[173421]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:42 compute-1 sudo[173390]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:43 compute-1 sudo[173580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyfktnhgihzoqfbqxqejspygvchrpcuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398202.8953066-974-280860552508741/AnsiballZ_systemd.py'
Nov 29 06:36:43 compute-1 sudo[173580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:43.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:43 compute-1 python3.9[173582]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:43 compute-1 systemd[1]: Reloading.
Nov 29 06:36:43 compute-1 systemd-rc-local-generator[173606]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:43 compute-1 systemd-sysv-generator[173613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:43.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:43 compute-1 sudo[173580]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:43 compute-1 ceph-mon[80754]: pgmap v728: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:44 compute-1 sudo[173770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogngwtzivtxhccmymmyggetxfaecimtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398204.074685-974-164405684782695/AnsiballZ_systemd.py'
Nov 29 06:36:44 compute-1 sudo[173770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:44 compute-1 podman[173772]: 2025-11-29 06:36:44.619305999 +0000 UTC m=+0.171274901 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:36:44 compute-1 python3.9[173773]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:44 compute-1 systemd[1]: Reloading.
Nov 29 06:36:44 compute-1 systemd-rc-local-generator[173833]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:44 compute-1 systemd-sysv-generator[173836]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:45 compute-1 sudo[173770]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:45.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:45.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:45 compute-1 ceph-mon[80754]: pgmap v729: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:47 compute-1 ceph-mon[80754]: pgmap v730: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:47 compute-1 sudo[173988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkcnwigujyyqubdnsccngegmzuehyxyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398206.8771315-1064-276862065384753/AnsiballZ_systemd.py'
Nov 29 06:36:47 compute-1 sudo[173988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:47.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:47 compute-1 python3.9[173990]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:47 compute-1 systemd[1]: Reloading.
Nov 29 06:36:47 compute-1 systemd-rc-local-generator[174019]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:47 compute-1 systemd-sysv-generator[174023]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:47.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:47 compute-1 sudo[173988]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:48 compute-1 sudo[174178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsizravaersdggvztusnxywuvzganpxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398208.0553029-1064-91504156264251/AnsiballZ_systemd.py'
Nov 29 06:36:48 compute-1 sudo[174178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:48 compute-1 python3.9[174180]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:48 compute-1 systemd[1]: Reloading.
Nov 29 06:36:48 compute-1 systemd-rc-local-generator[174213]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:48 compute-1 systemd-sysv-generator[174218]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:49 compute-1 sudo[174178]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:49.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:49 compute-1 sudo[174369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmoqnradvauwdjrnsscgftgyomemzgmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398209.2621632-1064-156280074546801/AnsiballZ_systemd.py'
Nov 29 06:36:49 compute-1 sudo[174369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:49.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:49 compute-1 python3.9[174371]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:50 compute-1 systemd[1]: Reloading.
Nov 29 06:36:50 compute-1 ceph-mon[80754]: pgmap v731: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:50 compute-1 systemd-rc-local-generator[174396]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:50 compute-1 systemd-sysv-generator[174403]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:50 compute-1 sudo[174369]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:50 compute-1 podman[174409]: 2025-11-29 06:36:50.481142868 +0000 UTC m=+0.072525251 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 06:36:50 compute-1 sudo[174578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnbgeumzbhaioyeqnkoognfrqxmmgfuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398210.5876217-1064-198855142758894/AnsiballZ_systemd.py'
Nov 29 06:36:50 compute-1 sudo[174578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:51 compute-1 python3.9[174580]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:51 compute-1 ceph-mon[80754]: pgmap v732: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:51.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:51 compute-1 sudo[174578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:51.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:51 compute-1 sudo[174733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdgsuarvbttfkoyljccrdcpmluybgdzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398211.4792051-1064-234981524553375/AnsiballZ_systemd.py'
Nov 29 06:36:51 compute-1 sudo[174733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:52 compute-1 python3.9[174735]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:52 compute-1 systemd[1]: Reloading.
Nov 29 06:36:52 compute-1 systemd-sysv-generator[174767]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:52 compute-1 systemd-rc-local-generator[174763]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:52 compute-1 sudo[174733]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:53.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:53 compute-1 sudo[174922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unrygjuiasjqnfqrgaxpdeiyfbpuggxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398213.2589967-1172-199939941675933/AnsiballZ_systemd.py'
Nov 29 06:36:53 compute-1 sudo[174922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:53.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:53 compute-1 python3.9[174924]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:54 compute-1 systemd[1]: Reloading.
Nov 29 06:36:54 compute-1 systemd-rc-local-generator[174946]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:54 compute-1 systemd-sysv-generator[174952]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:54 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 29 06:36:54 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 29 06:36:54 compute-1 sudo[174922]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:36:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:55.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:36:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:55.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:36:56 compute-1 ceph-mon[80754]: pgmap v733: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:56 compute-1 sudo[175114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhnxkorkmcpbrfxnldnuensbnzylvgja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398216.3618484-1196-95087691163176/AnsiballZ_systemd.py'
Nov 29 06:36:56 compute-1 sudo[175114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:57 compute-1 python3.9[175116]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:57 compute-1 sudo[175114]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:57.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:57 compute-1 ceph-mon[80754]: pgmap v734: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:57 compute-1 ceph-mon[80754]: pgmap v735: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.473165) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217473425, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2441, "num_deletes": 251, "total_data_size": 6349396, "memory_usage": 6431848, "flush_reason": "Manual Compaction"}
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217517600, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 4155641, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10584, "largest_seqno": 13020, "table_properties": {"data_size": 4145723, "index_size": 6348, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19629, "raw_average_key_size": 20, "raw_value_size": 4125981, "raw_average_value_size": 4210, "num_data_blocks": 284, "num_entries": 980, "num_filter_entries": 980, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397891, "oldest_key_time": 1764397891, "file_creation_time": 1764398217, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 44501 microseconds, and 16931 cpu microseconds.
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.517663) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 4155641 bytes OK
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.517689) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.519870) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.519896) EVENT_LOG_v1 {"time_micros": 1764398217519889, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.519920) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 6338934, prev total WAL file size 6338934, number of live WAL files 2.
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.522453) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(4058KB)], [21(9323KB)]
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217522652, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 13703122, "oldest_snapshot_seqno": -1}
Nov 29 06:36:57 compute-1 sudo[175269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcvtdbcropxbbjfdffshnffjaiubleht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398217.2850814-1196-37734102905718/AnsiballZ_systemd.py'
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4464 keys, 10609978 bytes, temperature: kUnknown
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217611716, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 10609978, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10575161, "index_size": 22547, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11205, "raw_key_size": 109199, "raw_average_key_size": 24, "raw_value_size": 10489542, "raw_average_value_size": 2349, "num_data_blocks": 972, "num_entries": 4464, "num_filter_entries": 4464, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398217, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:36:57 compute-1 sudo[175269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.612042) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 10609978 bytes
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.613653) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.7 rd, 119.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 9.1 +0.0 blob) out(10.1 +0.0 blob), read-write-amplify(5.9) write-amplify(2.6) OK, records in: 4982, records dropped: 518 output_compression: NoCompression
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.613674) EVENT_LOG_v1 {"time_micros": 1764398217613661, "job": 10, "event": "compaction_finished", "compaction_time_micros": 89169, "compaction_time_cpu_micros": 34863, "output_level": 6, "num_output_files": 1, "total_output_size": 10609978, "num_input_records": 4982, "num_output_records": 4464, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217614488, "job": 10, "event": "table_file_deletion", "file_number": 23}
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217616635, "job": 10, "event": "table_file_deletion", "file_number": 21}
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.522321) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.616741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.616749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.616751) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.616753) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.616755) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:36:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:57.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:57 compute-1 python3.9[175271]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:57 compute-1 sudo[175269]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:58 compute-1 sudo[175424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrcuzhtaqulyqiwiryybijfyeevigavs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398218.1380346-1196-143504500182919/AnsiballZ_systemd.py'
Nov 29 06:36:58 compute-1 sudo[175424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:58 compute-1 python3.9[175426]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:58 compute-1 sudo[175424]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:59.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:59 compute-1 sudo[175579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oequfmhhgjxiejheufgqasujazfcvhnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398219.1099932-1196-269932533840974/AnsiballZ_systemd.py'
Nov 29 06:36:59 compute-1 sudo[175579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:59 compute-1 python3.9[175581]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:36:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:36:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:59.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:36:59 compute-1 sudo[175579]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:00 compute-1 sudo[175734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cshajtiirihkezzrdleakcwwojimafle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398220.0759127-1196-105108063337245/AnsiballZ_systemd.py'
Nov 29 06:37:00 compute-1 sudo[175734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:00 compute-1 python3.9[175736]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:00 compute-1 sudo[175734]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:01 compute-1 sudo[175889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trdcghpphgcttxtqlrnnsapxgphdtivs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398220.9665186-1196-126591634386526/AnsiballZ_systemd.py'
Nov 29 06:37:01 compute-1 sudo[175889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:01.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:01 compute-1 python3.9[175891]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:01 compute-1 sudo[175889]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:01.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:01 compute-1 ceph-mon[80754]: pgmap v736: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:02 compute-1 sudo[176044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azeznevevnrvaqhmbwirxzlwkikfuldf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398221.8029528-1196-78082220509989/AnsiballZ_systemd.py'
Nov 29 06:37:02 compute-1 sudo[176044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:02 compute-1 python3.9[176046]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:02 compute-1 sudo[176044]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:02 compute-1 sudo[176199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kivraunirbeqcsvugaimhlhprjkcyqyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398222.6225736-1196-178108695081996/AnsiballZ_systemd.py'
Nov 29 06:37:02 compute-1 sudo[176199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:03 compute-1 python3.9[176201]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:03 compute-1 sudo[176199]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:03.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:03 compute-1 ceph-mon[80754]: pgmap v737: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:03 compute-1 sudo[176354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waenhyquzbxwdtxyvfzapzkcwagixlsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398223.4402256-1196-218567327667158/AnsiballZ_systemd.py'
Nov 29 06:37:03 compute-1 sudo[176354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:03.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:04 compute-1 python3.9[176356]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:04 compute-1 sudo[176354]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:04 compute-1 sudo[176509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmctifgbsrrlvwrxeglsezuuhqfneflt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398224.2893631-1196-207178814728125/AnsiballZ_systemd.py'
Nov 29 06:37:04 compute-1 sudo[176509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:04 compute-1 ceph-mon[80754]: pgmap v738: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:04 compute-1 python3.9[176511]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:04 compute-1 sudo[176509]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:05.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:05 compute-1 sudo[176664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnvmpslwktreggtbflzyolxlmjmrzetv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398225.1824062-1196-235885299093029/AnsiballZ_systemd.py'
Nov 29 06:37:05 compute-1 sudo[176664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:05 compute-1 python3.9[176666]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:05.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:05 compute-1 sudo[176664]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:06 compute-1 ceph-mon[80754]: pgmap v739: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:06 compute-1 sudo[176819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lahuowdbwgiughwdokufbhjusqksutov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398225.9821208-1196-70304283585851/AnsiballZ_systemd.py'
Nov 29 06:37:06 compute-1 sudo[176819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:06 compute-1 python3.9[176821]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:06 compute-1 sudo[176819]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:07 compute-1 sudo[176974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeapqqticxejqbffwnlehtbtasjsdaez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398226.9513767-1196-240023066575998/AnsiballZ_systemd.py'
Nov 29 06:37:07 compute-1 sudo[176974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:07.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:07 compute-1 ceph-mon[80754]: pgmap v740: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:07 compute-1 python3.9[176976]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:07 compute-1 sudo[176974]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:07.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:08 compute-1 sudo[177129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhykugzsblcpfyjuisahwdvdrjipxysg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398227.8177533-1196-184675125882570/AnsiballZ_systemd.py'
Nov 29 06:37:08 compute-1 sudo[177129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:08 compute-1 python3.9[177131]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:37:08 compute-1 sudo[177129]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:09.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:09.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:10 compute-1 ceph-mon[80754]: pgmap v741: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:10 compute-1 sudo[177286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsfbmoifsekqtbsauxskwrlwacckosjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398230.671212-1502-64368049852155/AnsiballZ_file.py'
Nov 29 06:37:10 compute-1 sudo[177286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:11 compute-1 python3.9[177288]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:11 compute-1 sudo[177286]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:11 compute-1 sshd-session[177159]: Invalid user sinusbot from 119.45.242.7 port 52932
Nov 29 06:37:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:11.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:11 compute-1 sudo[177438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkivcejvxbccrdhtpllipcnncavpcian ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398231.3493733-1502-98708749560610/AnsiballZ_file.py'
Nov 29 06:37:11 compute-1 sudo[177438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:11.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:11 compute-1 python3.9[177440]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:11 compute-1 sudo[177438]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:12 compute-1 sshd-session[177159]: Received disconnect from 119.45.242.7 port 52932:11: Bye Bye [preauth]
Nov 29 06:37:12 compute-1 sshd-session[177159]: Disconnected from invalid user sinusbot 119.45.242.7 port 52932 [preauth]
Nov 29 06:37:12 compute-1 sudo[177590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqpukkpcqlvrzmccbyipuchmtxcwmfom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398232.074701-1502-165169109416537/AnsiballZ_file.py'
Nov 29 06:37:12 compute-1 sudo[177590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:12 compute-1 python3.9[177592]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:12 compute-1 sudo[177590]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:13 compute-1 sudo[177742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwkstkszaavnvbrpelknepgeigpqsdsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398232.7845073-1502-264187423908717/AnsiballZ_file.py'
Nov 29 06:37:13 compute-1 sudo[177742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:13 compute-1 python3.9[177744]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:13 compute-1 sudo[177742]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:13.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:13 compute-1 ceph-mon[80754]: pgmap v742: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:13 compute-1 sudo[177894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doaeipkyayufrxpxxpkhbropjtdjlvhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398233.418433-1502-240254745804520/AnsiballZ_file.py'
Nov 29 06:37:13 compute-1 sudo[177894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:13.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:13 compute-1 python3.9[177896]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:13 compute-1 sudo[177894]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:14 compute-1 sudo[178046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejcippqvbkuisromckvfocizaxgrajrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398234.0640333-1502-86019531920846/AnsiballZ_file.py'
Nov 29 06:37:14 compute-1 sudo[178046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:14 compute-1 ceph-mon[80754]: pgmap v743: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:14 compute-1 python3.9[178048]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:37:14 compute-1 sudo[178046]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:15 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:15.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:15 compute-1 podman[178148]: 2025-11-29 06:37:15.394397965 +0000 UTC m=+0.129918206 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 29 06:37:15 compute-1 sudo[178224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjqobrkdnnklqxintiqhdnllwijjonjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398234.9247022-1631-272444055300555/AnsiballZ_stat.py'
Nov 29 06:37:15 compute-1 sudo[178224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:15 compute-1 python3.9[178226]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:15 compute-1 ceph-mon[80754]: pgmap v744: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:15 compute-1 sudo[178224]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:15.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:37:15.903 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:37:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:37:15.904 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:37:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:37:15.905 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:37:16 compute-1 sudo[178350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvblvrtlpiearrrffkmzqwjpysqmgxty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398234.9247022-1631-272444055300555/AnsiballZ_copy.py'
Nov 29 06:37:16 compute-1 sudo[178350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:16 compute-1 python3.9[178352]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398234.9247022-1631-272444055300555/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:16 compute-1 sudo[178350]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:16 compute-1 sudo[178504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcsugjxoxveiefxyedhgfaqbfzwqoldg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398236.3639603-1631-86036261312485/AnsiballZ_stat.py'
Nov 29 06:37:16 compute-1 sudo[178504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:16 compute-1 python3.9[178506]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:16 compute-1 sudo[178504]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:17 compute-1 sudo[178629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cblrqvijwiloedubgyprmjpibagmbsgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398236.3639603-1631-86036261312485/AnsiballZ_copy.py'
Nov 29 06:37:17 compute-1 sudo[178629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:17.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:17 compute-1 python3.9[178631]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398236.3639603-1631-86036261312485/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:17 compute-1 sudo[178629]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:17.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:17 compute-1 sudo[178781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfegkqunlzpscygywsyxngefxlfyvoui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398237.6124299-1631-266298084345523/AnsiballZ_stat.py'
Nov 29 06:37:17 compute-1 sudo[178781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:17 compute-1 sshd-session[178400]: Received disconnect from 66.94.122.234 port 38564:11: Bye Bye [preauth]
Nov 29 06:37:17 compute-1 sshd-session[178400]: Disconnected from authenticating user root 66.94.122.234 port 38564 [preauth]
Nov 29 06:37:18 compute-1 python3.9[178783]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:18 compute-1 sudo[178781]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:18 compute-1 ceph-mon[80754]: pgmap v745: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:18 compute-1 sudo[178906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fktqhqglgafjehqwaapqwuvldcvtbarq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398237.6124299-1631-266298084345523/AnsiballZ_copy.py'
Nov 29 06:37:18 compute-1 sudo[178906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:18 compute-1 python3.9[178908]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398237.6124299-1631-266298084345523/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:18 compute-1 sudo[178906]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:19 compute-1 sudo[179058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koxtjidrrfaimegkpbyiribzoggcxjzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398238.8239045-1631-178566270045473/AnsiballZ_stat.py'
Nov 29 06:37:19 compute-1 sudo[179058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:19 compute-1 python3.9[179060]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:19 compute-1 sudo[179058]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:19.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:19 compute-1 sudo[179183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecxugildbgolmrnluyhfksoxknzsidth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398238.8239045-1631-178566270045473/AnsiballZ_copy.py'
Nov 29 06:37:19 compute-1 sudo[179183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:19 compute-1 ceph-mon[80754]: pgmap v746: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:19.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:19 compute-1 python3.9[179185]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398238.8239045-1631-178566270045473/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:19 compute-1 sudo[179183]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:20 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:20 compute-1 sudo[179335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avqprqilwawsuhasuadapfbdamjpjblg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398240.1058025-1631-220718953633725/AnsiballZ_stat.py'
Nov 29 06:37:20 compute-1 sudo[179335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:20 compute-1 python3.9[179337]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:20 compute-1 sudo[179335]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:21 compute-1 sudo[179471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eydmiwxzkdvuoaguwazaiuppfmhixzft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398240.1058025-1631-220718953633725/AnsiballZ_copy.py'
Nov 29 06:37:21 compute-1 sudo[179471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:21 compute-1 podman[179434]: 2025-11-29 06:37:21.296622574 +0000 UTC m=+0.092095604 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:37:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:37:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:21.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:37:21 compute-1 python3.9[179477]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398240.1058025-1631-220718953633725/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:21 compute-1 sudo[179471]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:21 compute-1 ceph-mon[80754]: pgmap v747: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:21.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:21 compute-1 sudo[179630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcxqbjefwoeqhdevxfxyrnihjiesxqrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398241.6541767-1631-251541886994568/AnsiballZ_stat.py'
Nov 29 06:37:21 compute-1 sudo[179630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:22 compute-1 python3.9[179632]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:22 compute-1 sudo[179630]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:22 compute-1 sudo[179755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wesydwcuaztmhgxpolmiwamwtgkznrqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398241.6541767-1631-251541886994568/AnsiballZ_copy.py'
Nov 29 06:37:22 compute-1 sudo[179755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:22 compute-1 python3.9[179757]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398241.6541767-1631-251541886994568/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:22 compute-1 sudo[179755]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:23 compute-1 sudo[179907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-infdyupyoxxmqgnffsnigqfangzxvkvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398242.8746698-1631-227206292202920/AnsiballZ_stat.py'
Nov 29 06:37:23 compute-1 sudo[179907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:23 compute-1 python3.9[179909]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:23.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:23 compute-1 sudo[179907]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:23 compute-1 sudo[180030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivathhknaorzfzsnlxuomezjsirxnbrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398242.8746698-1631-227206292202920/AnsiballZ_copy.py'
Nov 29 06:37:23 compute-1 sudo[180030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:24.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:25 compute-1 ceph-mon[80754]: pgmap v748: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:25 compute-1 python3.9[180032]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398242.8746698-1631-227206292202920/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:25 compute-1 sudo[180030]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:25.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:25 compute-1 sudo[180183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdpxobrgogyedrtnntiwvmcolqhgjlmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398245.2563784-1631-76949557507197/AnsiballZ_stat.py'
Nov 29 06:37:25 compute-1 sudo[180183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:25 compute-1 python3.9[180185]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:25 compute-1 sudo[180183]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:26 compute-1 sudo[180308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzsakbutuolkzdzylaiozzpxcxpiuamf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398245.2563784-1631-76949557507197/AnsiballZ_copy.py'
Nov 29 06:37:26 compute-1 sudo[180308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:26 compute-1 python3.9[180310]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398245.2563784-1631-76949557507197/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:26 compute-1 sudo[180308]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:26.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:26 compute-1 sudo[180460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjvbgaekbofkevztfenybpgmebfqdolh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398246.6462448-1970-55341134990252/AnsiballZ_command.py'
Nov 29 06:37:26 compute-1 sudo[180460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:27 compute-1 python3.9[180462]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 29 06:37:27 compute-1 sudo[180460]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:27.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:28.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:29.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:30 compute-1 sudo[180615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvabrnqptleslepsxllwhbyxkslegfer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398250.5105278-1997-142951996210738/AnsiballZ_file.py'
Nov 29 06:37:30 compute-1 sudo[180615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:30.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:31 compute-1 python3.9[180617]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:31 compute-1 sudo[180615]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:31 compute-1 ceph-mon[80754]: pgmap v749: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:31.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:31 compute-1 sudo[180767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axqprqfesaltiwebvsdmjaataypgupso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398251.2502978-1997-15689580948209/AnsiballZ_file.py'
Nov 29 06:37:31 compute-1 sudo[180767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:31 compute-1 sshd-session[180585]: Received disconnect from 93.157.248.178 port 54142:11: Bye Bye [preauth]
Nov 29 06:37:31 compute-1 sshd-session[180585]: Disconnected from authenticating user root 93.157.248.178 port 54142 [preauth]
Nov 29 06:37:31 compute-1 python3.9[180769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:31 compute-1 sudo[180767]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:32 compute-1 sudo[180919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pddpihvzkxbqibkwcjmieoieiuaxqqsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398251.9149337-1997-121015276720501/AnsiballZ_file.py'
Nov 29 06:37:32 compute-1 sudo[180919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:32 compute-1 python3.9[180921]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:32 compute-1 sudo[180919]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:32.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:32 compute-1 sudo[181071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gllokylohpgkkkvbrbqxguocqyosfuwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398252.6097052-1997-198178589384500/AnsiballZ_file.py'
Nov 29 06:37:32 compute-1 sudo[181071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:33 compute-1 python3.9[181073]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:33 compute-1 sudo[181071]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:33.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:33 compute-1 sudo[181223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iitwamsiadrsnclltwbzwhwwfgayawhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398253.3062227-1997-219750804978380/AnsiballZ_file.py'
Nov 29 06:37:33 compute-1 sudo[181223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:33 compute-1 python3.9[181225]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:33 compute-1 sudo[181223]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:34 compute-1 sudo[181375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvsqvqtelfegknvmhnhuupghxrrvpice ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398254.0193553-1997-192480852934645/AnsiballZ_file.py'
Nov 29 06:37:34 compute-1 sudo[181375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:34 compute-1 python3.9[181377]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:34 compute-1 sudo[181375]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:34.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:34 compute-1 sudo[181527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfawipjgortajfxxbaypuabqhztsrflb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398254.6801498-1997-72081526549142/AnsiballZ_file.py'
Nov 29 06:37:34 compute-1 sudo[181527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:35 compute-1 python3.9[181529]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:35 compute-1 sudo[181527]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:35.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:35 compute-1 sudo[181679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pckfvqnzpmznwkekgailfydmnqkbafkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398255.3612719-1997-218443449246189/AnsiballZ_file.py'
Nov 29 06:37:35 compute-1 sudo[181679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:35 compute-1 ceph-mon[80754]: pgmap v750: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:35 compute-1 ceph-mon[80754]: pgmap v751: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:35 compute-1 ceph-mon[80754]: pgmap v752: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:35 compute-1 python3.9[181681]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:35 compute-1 sudo[181679]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:36 compute-1 sudo[181831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bstplgshtjbutakfdunbtfysvdcbbvri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398256.0285957-1997-187494910472248/AnsiballZ_file.py'
Nov 29 06:37:36 compute-1 sudo[181831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:36 compute-1 python3.9[181833]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:36 compute-1 sudo[181831]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:36.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:36 compute-1 sudo[181983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akhfwboisxwexhgfcbcypzlbkadnduut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398256.6889434-1997-275736891340595/AnsiballZ_file.py'
Nov 29 06:37:36 compute-1 sudo[181983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:37 compute-1 python3.9[181985]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:37 compute-1 sudo[181983]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:37.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:37 compute-1 sudo[182135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gakbgqmykpneahqyrlmsdfkfuzwhjeti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398257.3181999-1997-16909438414595/AnsiballZ_file.py'
Nov 29 06:37:37 compute-1 sudo[182135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:37 compute-1 python3.9[182137]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:37 compute-1 sudo[182135]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:38 compute-1 sudo[182287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wupqwzncrcusxehfuavekilevnndjtkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398257.9632337-1997-147861365649900/AnsiballZ_file.py'
Nov 29 06:37:38 compute-1 sudo[182287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:38 compute-1 python3.9[182289]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:38 compute-1 sudo[182287]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:38 compute-1 ceph-mon[80754]: pgmap v753: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:38 compute-1 ceph-mon[80754]: pgmap v754: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:38.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:38 compute-1 sudo[182439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlogwefcnlwefpetqxfxvfgccfzkytxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398258.6117604-1997-99449261029525/AnsiballZ_file.py'
Nov 29 06:37:38 compute-1 sudo[182439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:39 compute-1 python3.9[182441]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:39 compute-1 sudo[182439]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:39.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:39 compute-1 sudo[182591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgushuhwlqkemainvmvzmnwmbvmnhqqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398259.3602438-1997-261455128522804/AnsiballZ_file.py'
Nov 29 06:37:39 compute-1 sudo[182591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:39 compute-1 python3.9[182593]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:39 compute-1 sudo[182591]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:40 compute-1 ceph-mon[80754]: pgmap v755: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:40 compute-1 ceph-mon[80754]: pgmap v756: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:40.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:41 compute-1 sudo[182618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:41 compute-1 sudo[182618]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:41 compute-1 sudo[182618]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:41 compute-1 sudo[182643]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:37:41 compute-1 sudo[182643]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:41 compute-1 sudo[182643]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:41 compute-1 sudo[182668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:41 compute-1 sudo[182668]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:41 compute-1 sudo[182668]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:41.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:41 compute-1 sudo[182693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:37:41 compute-1 sudo[182693]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:41 compute-1 sudo[182916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rswgzbbwodnhlbodblxojrkscwvzyfbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398261.6088867-2294-79249116156935/AnsiballZ_stat.py'
Nov 29 06:37:41 compute-1 sudo[182916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:42 compute-1 podman[182922]: 2025-11-29 06:37:42.042558225 +0000 UTC m=+0.077726469 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Nov 29 06:37:42 compute-1 python3.9[182921]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:42 compute-1 sudo[182916]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:42 compute-1 podman[182922]: 2025-11-29 06:37:42.150519243 +0000 UTC m=+0.185687457 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Nov 29 06:37:42 compute-1 sudo[183149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ialvfanuaeghnzrpqfpeyztvuyxojpew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398261.6088867-2294-79249116156935/AnsiballZ_copy.py'
Nov 29 06:37:42 compute-1 sudo[183149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:42 compute-1 sudo[182693]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:42 compute-1 python3.9[183153]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398261.6088867-2294-79249116156935/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:42 compute-1 sudo[183149]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:42.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:42 compute-1 ceph-mon[80754]: pgmap v757: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:43 compute-1 sudo[183318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbsvxogbqsnicykxpameelnimgluxoye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398262.894948-2294-2001492489449/AnsiballZ_stat.py'
Nov 29 06:37:43 compute-1 sudo[183318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:43 compute-1 sudo[183321]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:43 compute-1 sudo[183321]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:43 compute-1 sudo[183321]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:43 compute-1 sudo[183346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:37:43 compute-1 sudo[183346]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:43 compute-1 python3.9[183320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:43 compute-1 sudo[183346]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:43 compute-1 sudo[183318]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:43.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:43 compute-1 sudo[183371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:37:43 compute-1 sudo[183371]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:43 compute-1 sudo[183371]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:43 compute-1 sudo[183419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:37:43 compute-1 sudo[183419]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:37:43 compute-1 sudo[183553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nislcnukesafhhgekritbyhbotfeehpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398262.894948-2294-2001492489449/AnsiballZ_copy.py'
Nov 29 06:37:43 compute-1 sudo[183553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:43 compute-1 python3.9[183555]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398262.894948-2294-2001492489449/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:43 compute-1 sudo[183553]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:43 compute-1 sudo[183419]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:44 compute-1 sudo[183722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eakpqzczkrngtybnbezimsbnahavqyxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398264.1167877-2294-131465536107601/AnsiballZ_stat.py'
Nov 29 06:37:44 compute-1 sudo[183722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:44 compute-1 python3.9[183724]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:44 compute-1 sudo[183722]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:44.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:45 compute-1 sudo[183845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbgsctnwfrsrpkxwszqzshwxrhdhrncz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398264.1167877-2294-131465536107601/AnsiballZ_copy.py'
Nov 29 06:37:45 compute-1 sudo[183845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:45 compute-1 python3.9[183847]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398264.1167877-2294-131465536107601/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:45 compute-1 sudo[183845]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:45 compute-1 ceph-mon[80754]: pgmap v758: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:45 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:37:45 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:37:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:45.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:45 compute-1 sudo[184010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymfnkhldwtnadizcdkjqkhofbbcfekgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398265.3699958-2294-47270268184019/AnsiballZ_stat.py'
Nov 29 06:37:45 compute-1 sudo[184010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:45 compute-1 podman[183971]: 2025-11-29 06:37:45.767582501 +0000 UTC m=+0.136953904 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 06:37:45 compute-1 python3.9[184016]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:45 compute-1 sudo[184010]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:46 compute-1 sudo[184146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edvrvssbyziabtllxalvxbhtkxepcbaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398265.3699958-2294-47270268184019/AnsiballZ_copy.py'
Nov 29 06:37:46 compute-1 sudo[184146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:46 compute-1 python3.9[184148]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398265.3699958-2294-47270268184019/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:46 compute-1 sudo[184146]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:46.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:47 compute-1 sudo[184298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ughikktkevwceytonqkqdytjwbbyvouy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398266.6962094-2294-211728434380370/AnsiballZ_stat.py'
Nov 29 06:37:47 compute-1 sudo[184298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:47 compute-1 python3.9[184300]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:47 compute-1 sudo[184298]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:47.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:47 compute-1 sudo[184421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykerpcnoyigfkmhmejpuctutoeetfltb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398266.6962094-2294-211728434380370/AnsiballZ_copy.py'
Nov 29 06:37:47 compute-1 sudo[184421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:47 compute-1 ceph-mon[80754]: pgmap v759: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:37:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:37:47 compute-1 python3.9[184423]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398266.6962094-2294-211728434380370/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:47 compute-1 sudo[184421]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:48 compute-1 sudo[184573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvmyjwmbgxrdoqaihuzgkermevyqlvyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398268.1008565-2294-13675310506462/AnsiballZ_stat.py'
Nov 29 06:37:48 compute-1 sudo[184573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:48 compute-1 python3.9[184575]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:48 compute-1 sudo[184573]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:48.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:48 compute-1 ceph-mon[80754]: pgmap v760: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:37:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:37:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:37:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:37:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:37:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:37:49 compute-1 sudo[184696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxdfhggpppcnzibhrdgjfysxszzshdou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398268.1008565-2294-13675310506462/AnsiballZ_copy.py'
Nov 29 06:37:49 compute-1 sudo[184696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:49 compute-1 python3.9[184698]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398268.1008565-2294-13675310506462/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:49 compute-1 sudo[184696]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:49.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:49 compute-1 sudo[184848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bilmbkwdzsizeqtektqdysjwdnejsdgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398269.4193218-2294-119258112079153/AnsiballZ_stat.py'
Nov 29 06:37:49 compute-1 sudo[184848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:49 compute-1 python3.9[184850]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:49 compute-1 sudo[184848]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:49 compute-1 ceph-mon[80754]: pgmap v761: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:50 compute-1 sudo[184971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqyleskkzpvkafxzvhynekweufpdedmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398269.4193218-2294-119258112079153/AnsiballZ_copy.py'
Nov 29 06:37:50 compute-1 sudo[184971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:50 compute-1 python3.9[184973]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398269.4193218-2294-119258112079153/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:50 compute-1 sudo[184971]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:50.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:51 compute-1 sudo[185123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyxttnoosxadqrqbutonfujoclppbgjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398270.6600578-2294-241243882900888/AnsiballZ_stat.py'
Nov 29 06:37:51 compute-1 sudo[185123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:51 compute-1 python3.9[185125]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:51 compute-1 sudo[185123]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:51.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:51 compute-1 sudo[185259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxzenurzafmywqgcetwtzvufjwtisasq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398270.6600578-2294-241243882900888/AnsiballZ_copy.py'
Nov 29 06:37:51 compute-1 sudo[185259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:51 compute-1 podman[185220]: 2025-11-29 06:37:51.63602868 +0000 UTC m=+0.069419567 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 06:37:51 compute-1 python3.9[185265]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398270.6600578-2294-241243882900888/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:51 compute-1 sudo[185259]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:52 compute-1 ceph-mon[80754]: pgmap v762: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:52 compute-1 sudo[185415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzsjsixgvwjxzgbddramazcdxwobtmso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398272.0413673-2294-224655406299635/AnsiballZ_stat.py'
Nov 29 06:37:52 compute-1 sudo[185415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:52 compute-1 python3.9[185417]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:52 compute-1 sudo[185415]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:52 compute-1 sudo[185538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jljvjqvpejfujhnelqkpyhzjfqonrxyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398272.0413673-2294-224655406299635/AnsiballZ_copy.py'
Nov 29 06:37:52 compute-1 sudo[185538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:37:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:52.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:37:53 compute-1 python3.9[185540]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398272.0413673-2294-224655406299635/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:53 compute-1 sudo[185538]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:53 compute-1 ceph-mon[80754]: pgmap v763: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:53.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:53 compute-1 sudo[185690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piuqewkpulvsdlrixfmdaesjqfnbphho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398273.2643209-2294-171697872700200/AnsiballZ_stat.py'
Nov 29 06:37:53 compute-1 sudo[185690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:53 compute-1 python3.9[185692]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:53 compute-1 sudo[185690]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:54 compute-1 sudo[185813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxekmskshsbaairilrjmbzwvxbvtczdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398273.2643209-2294-171697872700200/AnsiballZ_copy.py'
Nov 29 06:37:54 compute-1 sudo[185813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:54 compute-1 python3.9[185815]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398273.2643209-2294-171697872700200/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:54 compute-1 sudo[185813]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:54.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:55 compute-1 sudo[185965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eogtdgiifsqwsjdrhrrggdpyxlnewobh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398274.712713-2294-73325298030304/AnsiballZ_stat.py'
Nov 29 06:37:55 compute-1 sudo[185965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:55.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:37:55 compute-1 python3.9[185967]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:55 compute-1 sudo[185965]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:56 compute-1 sudo[186088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrlgfnhwilrmygaagwhkwtxtitrmywom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398274.712713-2294-73325298030304/AnsiballZ_copy.py'
Nov 29 06:37:56 compute-1 sudo[186088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:56 compute-1 python3.9[186090]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398274.712713-2294-73325298030304/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:56 compute-1 sudo[186088]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:56.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:57 compute-1 sudo[186240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnpziotfastewpryqfhsnxjnvhmgbjtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398276.7182071-2294-162280638203457/AnsiballZ_stat.py'
Nov 29 06:37:57 compute-1 sudo[186240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:57 compute-1 python3.9[186242]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:57 compute-1 sudo[186240]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:37:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:57.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:37:57 compute-1 sudo[186363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abzekjjbqfxhjwqgiwacctsnypobcecn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398276.7182071-2294-162280638203457/AnsiballZ_copy.py'
Nov 29 06:37:57 compute-1 sudo[186363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:57 compute-1 python3.9[186365]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398276.7182071-2294-162280638203457/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:57 compute-1 sudo[186363]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:58 compute-1 ceph-mon[80754]: pgmap v764: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:37:58 compute-1 sudo[186515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhxwqdjqbbbriirmfvazbszyflwsujid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398278.1190464-2294-37751764207498/AnsiballZ_stat.py'
Nov 29 06:37:58 compute-1 sudo[186515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:58 compute-1 python3.9[186517]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:58 compute-1 sudo[186515]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:58 compute-1 sudo[186638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdxzrggktxrbkjpnddqqoxdtnlkydlzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398278.1190464-2294-37751764207498/AnsiballZ_copy.py'
Nov 29 06:37:58 compute-1 sudo[186638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:58.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:59 compute-1 python3.9[186640]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398278.1190464-2294-37751764207498/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:59 compute-1 sudo[186638]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:37:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:37:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:59.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:37:59 compute-1 sudo[186790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvxtjscdwrqziaepquxnhnengsqgvqbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398279.3315175-2294-19475005926558/AnsiballZ_stat.py'
Nov 29 06:37:59 compute-1 sudo[186790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:59 compute-1 python3.9[186792]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:59 compute-1 sudo[186790]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:00 compute-1 sudo[186913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvuwjfwtudeufswumvsevzwadvvmnopo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398279.3315175-2294-19475005926558/AnsiballZ_copy.py'
Nov 29 06:38:00 compute-1 sudo[186913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:00 compute-1 ceph-mon[80754]: pgmap v765: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:00 compute-1 python3.9[186915]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398279.3315175-2294-19475005926558/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:00 compute-1 sudo[186913]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:00.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:01.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:01 compute-1 python3.9[187066]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:02.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:03 compute-1 sudo[187219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcnywxaobmwfemxsoyhegcozkbtfkiyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398282.6124265-2912-148749379372534/AnsiballZ_seboolean.py'
Nov 29 06:38:03 compute-1 sudo[187219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:03.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:03 compute-1 python3.9[187221]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 29 06:38:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:04.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:05 compute-1 sudo[187219]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:05.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:05 compute-1 ceph-mon[80754]: pgmap v766: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:05 compute-1 ceph-mon[80754]: pgmap v767: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:06 compute-1 sshd-session[186940]: Connection closed by 119.45.242.7 port 35460 [preauth]
Nov 29 06:38:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:06.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:07 compute-1 sudo[187251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:38:07 compute-1 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 29 06:38:07 compute-1 sudo[187251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:38:07 compute-1 sudo[187251]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:07 compute-1 sudo[187276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:38:07 compute-1 sudo[187276]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:38:07 compute-1 sudo[187276]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:07.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:07 compute-1 ceph-mon[80754]: pgmap v768: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:07 compute-1 ceph-mon[80754]: pgmap v769: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:38:08 compute-1 sudo[187426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofahgnvnqgtynfkxhfmkjkpzcsjmvdpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398288.1715224-2936-15615734556224/AnsiballZ_copy.py'
Nov 29 06:38:08 compute-1 sudo[187426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:08 compute-1 python3.9[187428]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:08 compute-1 sudo[187426]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:08.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:09 compute-1 sudo[187578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwljzaxccsndjwogfdxkzhblydzqaydt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398288.7988968-2936-23452609142770/AnsiballZ_copy.py'
Nov 29 06:38:09 compute-1 sudo[187578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:09 compute-1 python3.9[187580]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:09 compute-1 sudo[187578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:09.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:09 compute-1 sudo[187730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agdkvykvjjhrzfxgrxnumfqmflwkgsbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398289.5552425-2936-127685678173585/AnsiballZ_copy.py'
Nov 29 06:38:09 compute-1 sudo[187730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:10 compute-1 python3.9[187732]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:10 compute-1 sudo[187730]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:10 compute-1 sudo[187882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjdprlrdfelkrldgdfbwomgtxzwnxtdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398290.4144585-2936-227756591092586/AnsiballZ_copy.py'
Nov 29 06:38:10 compute-1 sudo[187882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:10 compute-1 python3.9[187884]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:10 compute-1 sudo[187882]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:10.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:11 compute-1 sudo[188034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frijebvajqbtsvoycpfjmmozmzhqytqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398291.0774963-2936-94040590029936/AnsiballZ_copy.py'
Nov 29 06:38:11 compute-1 sudo[188034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:11 compute-1 ceph-mon[80754]: pgmap v770: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:38:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:11.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:11 compute-1 python3.9[188036]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:11 compute-1 sudo[188034]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:12 compute-1 sudo[188186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fumcureommyvusgwjifntwxgwyirgekz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398291.9780383-3044-37085422536830/AnsiballZ_copy.py'
Nov 29 06:38:12 compute-1 sudo[188186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:12 compute-1 python3.9[188188]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:12 compute-1 sudo[188186]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:12 compute-1 auditd[701]: Audit daemon rotating log files
Nov 29 06:38:12 compute-1 ceph-mon[80754]: pgmap v771: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:12 compute-1 ceph-mon[80754]: pgmap v772: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:12 compute-1 sudo[188338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqeuhwyzpuofwuighrvoxgmzdmcgpsmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398292.6454575-3044-14048929246634/AnsiballZ_copy.py'
Nov 29 06:38:12 compute-1 sudo[188338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:12.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:13 compute-1 python3.9[188340]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:13 compute-1 sudo[188338]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:13.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:13 compute-1 sudo[188490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yziwegepeswtgmyukemmpwfnrgfjygzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398293.30076-3044-103052638380040/AnsiballZ_copy.py'
Nov 29 06:38:13 compute-1 sudo[188490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:13 compute-1 python3.9[188492]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:13 compute-1 sudo[188490]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:14 compute-1 sudo[188642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukuyiwdxzvbuboiivphlbpdnaoegxuhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398293.9811482-3044-98076211979084/AnsiballZ_copy.py'
Nov 29 06:38:14 compute-1 sudo[188642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:14 compute-1 ceph-mon[80754]: pgmap v773: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:14 compute-1 python3.9[188644]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:14 compute-1 sudo[188642]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:14.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:15 compute-1 sudo[188794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywdzjspsdjlfkyzdsyvukkynxnpxxmyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398294.7507217-3044-259034963384820/AnsiballZ_copy.py'
Nov 29 06:38:15 compute-1 sudo[188794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:15 compute-1 python3.9[188796]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:15 compute-1 sudo[188794]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:15.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:15 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:38:15.904 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:38:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:38:15.906 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:38:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:38:15.907 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:38:16 compute-1 ceph-mon[80754]: pgmap v774: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:16 compute-1 sudo[188955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyvtnjwatqujeoyytmgqjsucdtkslhli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398295.708229-3152-257635688667690/AnsiballZ_systemd.py'
Nov 29 06:38:16 compute-1 sudo[188955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:16 compute-1 podman[188920]: 2025-11-29 06:38:16.10482321 +0000 UTC m=+0.122306001 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 06:38:16 compute-1 python3.9[188965]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:38:16 compute-1 systemd[1]: Reloading.
Nov 29 06:38:16 compute-1 systemd-rc-local-generator[189003]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:16 compute-1 systemd-sysv-generator[189006]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:16 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Nov 29 06:38:16 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Nov 29 06:38:16 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 29 06:38:16 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 29 06:38:16 compute-1 systemd[1]: Starting libvirt logging daemon...
Nov 29 06:38:16 compute-1 systemd[1]: Started libvirt logging daemon.
Nov 29 06:38:16 compute-1 sudo[188955]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:16.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:17 compute-1 ceph-mon[80754]: pgmap v775: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:17 compute-1 sudo[189166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldlpqkcxmxyxtjagbkmosjsmumglrjkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398296.9521194-3152-39650673655612/AnsiballZ_systemd.py'
Nov 29 06:38:17 compute-1 sudo[189166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:17.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:17 compute-1 python3.9[189168]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:38:17 compute-1 systemd[1]: Reloading.
Nov 29 06:38:17 compute-1 systemd-rc-local-generator[189197]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:17 compute-1 systemd-sysv-generator[189202]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:17 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 29 06:38:17 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 29 06:38:17 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 29 06:38:17 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 29 06:38:17 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 29 06:38:17 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 29 06:38:17 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 06:38:18 compute-1 systemd[1]: Started libvirt nodedev daemon.
Nov 29 06:38:18 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 29 06:38:18 compute-1 sudo[189166]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:18 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 29 06:38:18 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 29 06:38:18 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 29 06:38:18 compute-1 sudo[189391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bssuenzdamiccdohiezghbrvtkbcfcfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398298.1997652-3152-226287935331254/AnsiballZ_systemd.py'
Nov 29 06:38:18 compute-1 sudo[189391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:18 compute-1 python3.9[189393]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:38:18 compute-1 systemd[1]: Reloading.
Nov 29 06:38:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:18.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:19 compute-1 systemd-sysv-generator[189425]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:19 compute-1 systemd-rc-local-generator[189422]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:19 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 29 06:38:19 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 29 06:38:19 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 29 06:38:19 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 29 06:38:19 compute-1 systemd[1]: Starting libvirt proxy daemon...
Nov 29 06:38:19 compute-1 systemd[1]: Started libvirt proxy daemon.
Nov 29 06:38:19 compute-1 sudo[189391]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:19.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:19 compute-1 setroubleshoot[189232]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l e72e4fab-4474-481d-9f3c-44e6530a45b6
Nov 29 06:38:19 compute-1 setroubleshoot[189232]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 29 06:38:19 compute-1 sudo[189605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utppzzoqgbnppkcqyqcoisqvmojcszeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398299.4986067-3152-73133848175339/AnsiballZ_systemd.py'
Nov 29 06:38:19 compute-1 sudo[189605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:20 compute-1 python3.9[189607]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:38:20 compute-1 systemd[1]: Reloading.
Nov 29 06:38:20 compute-1 systemd-rc-local-generator[189633]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:20 compute-1 systemd-sysv-generator[189636]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:20 compute-1 ceph-mon[80754]: pgmap v776: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:20 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Nov 29 06:38:20 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 29 06:38:20 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 29 06:38:20 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 29 06:38:20 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 29 06:38:20 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 29 06:38:20 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 29 06:38:20 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 29 06:38:20 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 29 06:38:20 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 29 06:38:20 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 06:38:20 compute-1 systemd[1]: Started libvirt QEMU daemon.
Nov 29 06:38:20 compute-1 sudo[189605]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:20 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:20.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:21.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:21 compute-1 sudo[189821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzqnscomnrujjxtpqmbvkbydjavgsylg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398300.9969454-3152-48934364682867/AnsiballZ_systemd.py'
Nov 29 06:38:21 compute-1 sudo[189821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:21 compute-1 ceph-mon[80754]: pgmap v777: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:21 compute-1 python3.9[189823]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:38:21 compute-1 systemd[1]: Reloading.
Nov 29 06:38:21 compute-1 podman[189825]: 2025-11-29 06:38:21.933678317 +0000 UTC m=+0.070404614 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:38:21 compute-1 systemd-rc-local-generator[189870]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:21 compute-1 systemd-sysv-generator[189874]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:22 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Nov 29 06:38:22 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Nov 29 06:38:22 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 29 06:38:22 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 29 06:38:22 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 29 06:38:22 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 29 06:38:22 compute-1 systemd[1]: Starting libvirt secret daemon...
Nov 29 06:38:22 compute-1 systemd[1]: Started libvirt secret daemon.
Nov 29 06:38:22 compute-1 sudo[189821]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:22.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:23.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:24 compute-1 ceph-mon[80754]: pgmap v778: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:24 compute-1 sudo[190052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olsnptihcafzszcwnjplkcmqmbhhgczs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398304.2810793-3264-74052427964056/AnsiballZ_file.py'
Nov 29 06:38:24 compute-1 sudo[190052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:24 compute-1 python3.9[190054]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:24 compute-1 sudo[190052]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:24.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:25.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:25 compute-1 sudo[190204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlpxyvdaukvkrrdjbvuqliuingyosxrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398305.2048483-3287-242380849295865/AnsiballZ_find.py'
Nov 29 06:38:25 compute-1 sudo[190204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:25 compute-1 python3.9[190206]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:38:25 compute-1 sudo[190204]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:26 compute-1 sudo[190356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eriyasjeyfjtutarahxpeojjaugjunvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398306.1405091-3311-104139709252346/AnsiballZ_command.py'
Nov 29 06:38:26 compute-1 sudo[190356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:26 compute-1 ceph-mon[80754]: pgmap v779: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:26 compute-1 python3.9[190358]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:26 compute-1 sudo[190356]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:26.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:27.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:27 compute-1 ceph-mon[80754]: pgmap v780: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:28 compute-1 python3.9[190512]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:38:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:28.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:29 compute-1 python3.9[190662]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:29.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:29 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 29 06:38:29 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.095s CPU time.
Nov 29 06:38:29 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 29 06:38:30 compute-1 python3.9[190785]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398308.7800333-3368-16976233971827/.source.xml follow=False _original_basename=secret.xml.j2 checksum=63744b3abb892aaab98ed7226f328ffc66ff66bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:30 compute-1 ceph-mon[80754]: pgmap v781: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:30 compute-1 sshd-session[190745]: Received disconnect from 71.70.164.48 port 43548:11: Bye Bye [preauth]
Nov 29 06:38:30 compute-1 sshd-session[190745]: Disconnected from authenticating user root 71.70.164.48 port 43548 [preauth]
Nov 29 06:38:30 compute-1 sudo[190935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpjgfioflzopupfiwsnjjlctvdkgaepy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398310.3172214-3413-94540117916333/AnsiballZ_command.py'
Nov 29 06:38:30 compute-1 sudo[190935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:30 compute-1 python3.9[190937]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 336ec58c-893b-528f-a0c1-6ed1196bc047
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:30 compute-1 polkitd[43499]: Registered Authentication Agent for unix-process:190939:377528 (system bus name :1.1854 [pkttyagent --process 190939 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Nov 29 06:38:30 compute-1 polkitd[43499]: Unregistered Authentication Agent for unix-process:190939:377528 (system bus name :1.1854, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Nov 29 06:38:30 compute-1 polkitd[43499]: Registered Authentication Agent for unix-process:190938:377527 (system bus name :1.1855 [pkttyagent --process 190938 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Nov 29 06:38:30 compute-1 polkitd[43499]: Unregistered Authentication Agent for unix-process:190938:377527 (system bus name :1.1855, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Nov 29 06:38:30 compute-1 sudo[190935]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:30.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:31 compute-1 ceph-mon[80754]: pgmap v782: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:31.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:31 compute-1 python3.9[191099]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:32 compute-1 sudo[191249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzygbotvfjutgysujhjniaculkleufxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398311.982327-3461-18581903620193/AnsiballZ_command.py'
Nov 29 06:38:32 compute-1 sudo[191249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:32 compute-1 sudo[191249]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:32.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:33 compute-1 sudo[191404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezitpsypeptbpxtplknovjijvvbuvimy ; FSID=336ec58c-893b-528f-a0c1-6ed1196bc047 KEY=AQCBjyppAAAAABAAXQRTF6pnk4WV7TfvJo0Mjg== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398312.792373-3485-129409490952165/AnsiballZ_command.py'
Nov 29 06:38:33 compute-1 sudo[191404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:33 compute-1 polkitd[43499]: Registered Authentication Agent for unix-process:191407:377773 (system bus name :1.1858 [pkttyagent --process 191407 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Nov 29 06:38:33 compute-1 polkitd[43499]: Unregistered Authentication Agent for unix-process:191407:377773 (system bus name :1.1858, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Nov 29 06:38:33 compute-1 sudo[191404]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:33.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:33 compute-1 ceph-mon[80754]: pgmap v783: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:33 compute-1 sshd-session[191352]: Invalid user ubuntu from 93.157.248.178 port 49138
Nov 29 06:38:33 compute-1 sudo[191562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lluloibxcovwkvwramrbszdhqnnpdkht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398313.5728412-3509-133937842473238/AnsiballZ_copy.py'
Nov 29 06:38:33 compute-1 sudo[191562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:33 compute-1 sshd-session[191352]: Received disconnect from 93.157.248.178 port 49138:11: Bye Bye [preauth]
Nov 29 06:38:33 compute-1 sshd-session[191352]: Disconnected from invalid user ubuntu 93.157.248.178 port 49138 [preauth]
Nov 29 06:38:34 compute-1 python3.9[191564]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:34 compute-1 sudo[191562]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:34 compute-1 sudo[191714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvmhosfsfqgocvctifbquyxeyugyejfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398314.448359-3533-10779246222150/AnsiballZ_stat.py'
Nov 29 06:38:34 compute-1 sudo[191714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:35.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:35 compute-1 python3.9[191716]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:35 compute-1 sudo[191714]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:35 compute-1 sudo[191837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yatqifyxvuzoqyumpnmhjssekjoxjkch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398314.448359-3533-10779246222150/AnsiballZ_copy.py'
Nov 29 06:38:35 compute-1 sudo[191837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:35.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:35 compute-1 python3.9[191839]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398314.448359-3533-10779246222150/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:35 compute-1 sudo[191837]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:35 compute-1 ceph-mon[80754]: pgmap v784: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:38:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 6284 writes, 25K keys, 6284 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 6284 writes, 1144 syncs, 5.49 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 444 writes, 711 keys, 444 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
                                           Interval WAL: 444 writes, 204 syncs, 2.18 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Nov 29 06:38:36 compute-1 sudo[191989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jearolggsaslilfzodyktkpbryfkhvkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398316.0393782-3581-163524874069701/AnsiballZ_file.py'
Nov 29 06:38:36 compute-1 sudo[191989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:36 compute-1 python3.9[191991]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:36 compute-1 sudo[191989]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:37.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:37 compute-1 sudo[192141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlhwzmzloktgsajunjfszcerpdiyzokj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398316.777954-3605-277681127969871/AnsiballZ_stat.py'
Nov 29 06:38:37 compute-1 sudo[192141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:37 compute-1 python3.9[192143]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:37 compute-1 sudo[192141]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:37.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:37 compute-1 sudo[192219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwxetzvlsygfknsnvkziawagyxpbdnyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398316.777954-3605-277681127969871/AnsiballZ_file.py'
Nov 29 06:38:37 compute-1 sudo[192219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:37 compute-1 python3.9[192221]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:37 compute-1 sudo[192219]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:38 compute-1 ceph-mon[80754]: pgmap v785: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:38 compute-1 sudo[192371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjiojmkuoefclhxlkmtblmiievaebawt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398318.0302482-3641-242000381587053/AnsiballZ_stat.py'
Nov 29 06:38:38 compute-1 sudo[192371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:38 compute-1 python3.9[192373]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:38 compute-1 sudo[192371]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:38 compute-1 sudo[192449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygmjyselbtxnhjwxezhmkiygfbgvcoeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398318.0302482-3641-242000381587053/AnsiballZ_file.py'
Nov 29 06:38:38 compute-1 sudo[192449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:39.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:39 compute-1 python3.9[192451]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.j8b_lkih recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:39 compute-1 sudo[192449]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:39.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:39 compute-1 sudo[192601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqeqijqriezlhqcazsotzolagwwdxqgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398319.3767903-3677-117462186581472/AnsiballZ_stat.py'
Nov 29 06:38:39 compute-1 sudo[192601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:39 compute-1 python3.9[192603]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:40 compute-1 sudo[192601]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:40 compute-1 sudo[192679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njjzkyvbdzqmupfidlrokpguxzjnobpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398319.3767903-3677-117462186581472/AnsiballZ_file.py'
Nov 29 06:38:40 compute-1 sudo[192679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:40 compute-1 python3.9[192681]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:40 compute-1 sudo[192679]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:40 compute-1 ceph-mon[80754]: pgmap v786: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:41.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:41 compute-1 sudo[192831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylxvsytdtfwldbfdkydnayebfuyeyguk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398320.725163-3716-80269044163956/AnsiballZ_command.py'
Nov 29 06:38:41 compute-1 sudo[192831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:41 compute-1 python3.9[192833]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:41 compute-1 sudo[192831]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:41.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:42 compute-1 sudo[192984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byrtydsiqbxhzssvemxuumxojtkuugvw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398321.650612-3740-35933827923058/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:38:42 compute-1 sudo[192984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:42 compute-1 python3[192986]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:38:42 compute-1 sudo[192984]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:43.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:43 compute-1 sudo[193136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ollhebeewvvlhemmymvgdkqnyfqtblyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398322.9229934-3764-84327558190200/AnsiballZ_stat.py'
Nov 29 06:38:43 compute-1 sudo[193136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:43 compute-1 python3.9[193138]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:43 compute-1 sudo[193136]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:43.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:43 compute-1 sudo[193214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkatmsnyjiwjkqonmdyvddffdfqiryih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398322.9229934-3764-84327558190200/AnsiballZ_file.py'
Nov 29 06:38:43 compute-1 sudo[193214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:43 compute-1 ceph-mon[80754]: pgmap v787: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:43 compute-1 python3.9[193216]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:43 compute-1 sudo[193214]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:44 compute-1 sudo[193366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntbgrcocjhwbzmirbesuaexsemciropb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398324.3139257-3800-45521263189917/AnsiballZ_stat.py'
Nov 29 06:38:44 compute-1 sudo[193366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:44 compute-1 python3.9[193368]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:44 compute-1 sudo[193366]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:45.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:45 compute-1 ceph-mon[80754]: pgmap v788: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:45 compute-1 sudo[193444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogxmblzbajobgzgiupewxzxmmchoacqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398324.3139257-3800-45521263189917/AnsiballZ_file.py'
Nov 29 06:38:45 compute-1 sudo[193444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:45 compute-1 python3.9[193446]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:45 compute-1 sudo[193444]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:45.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:45 compute-1 sudo[193596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufkqyserltrxsydxjhxkvxsfjldgajas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398325.5915482-3836-139267712896065/AnsiballZ_stat.py'
Nov 29 06:38:45 compute-1 sudo[193596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:46 compute-1 python3.9[193598]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:46 compute-1 sudo[193596]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:46 compute-1 podman[193603]: 2025-11-29 06:38:46.394990329 +0000 UTC m=+0.119498717 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 06:38:46 compute-1 sudo[193700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxkuukzihvroivgvjhizucecdcrkpkkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398325.5915482-3836-139267712896065/AnsiballZ_file.py'
Nov 29 06:38:46 compute-1 sudo[193700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:46 compute-1 python3.9[193702]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:46 compute-1 sudo[193700]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:47.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:47 compute-1 ceph-mon[80754]: pgmap v789: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:47 compute-1 sudo[193852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yktusrsnvxnjyoxtpfekcgfbookvjgit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398327.1360743-3872-40936594893045/AnsiballZ_stat.py'
Nov 29 06:38:47 compute-1 sudo[193852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:47.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:47 compute-1 python3.9[193854]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:47 compute-1 sudo[193852]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:47 compute-1 sudo[193930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxynamsjujdrklgkqwhvxajbgzhnavgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398327.1360743-3872-40936594893045/AnsiballZ_file.py'
Nov 29 06:38:47 compute-1 sudo[193930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:48 compute-1 ceph-mon[80754]: pgmap v790: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:48 compute-1 python3.9[193932]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:48 compute-1 sudo[193930]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:48 compute-1 sudo[194082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioevytpqoshtfsicooatkyokbpxkcwkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398328.3961663-3908-210462400689457/AnsiballZ_stat.py'
Nov 29 06:38:48 compute-1 sudo[194082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:49.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:49 compute-1 python3.9[194084]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:49 compute-1 sudo[194082]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:49 compute-1 ceph-mon[80754]: pgmap v791: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:49 compute-1 sudo[194209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgdkttdpauqyrcxtefudhrafyzzpcnkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398328.3961663-3908-210462400689457/AnsiballZ_copy.py'
Nov 29 06:38:49 compute-1 sudo[194209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:49.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:49 compute-1 python3.9[194211]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398328.3961663-3908-210462400689457/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:49 compute-1 sudo[194209]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:50 compute-1 sshd-session[194100]: Invalid user sol from 80.94.92.182 port 53888
Nov 29 06:38:50 compute-1 sudo[194361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icvodfmiqkotionuajslmrbyfazxmemx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398329.9775765-3953-139418777225888/AnsiballZ_file.py'
Nov 29 06:38:50 compute-1 sudo[194361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:50 compute-1 sshd-session[194100]: Connection closed by invalid user sol 80.94.92.182 port 53888 [preauth]
Nov 29 06:38:50 compute-1 python3.9[194363]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:50 compute-1 sudo[194361]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:51.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:51 compute-1 sudo[194513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixlbyyatnbcmopqdehvtrutkaukzdlyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398330.7602267-3977-155105877038813/AnsiballZ_command.py'
Nov 29 06:38:51 compute-1 sudo[194513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:51 compute-1 python3.9[194515]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:51 compute-1 sudo[194513]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:51.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:51 compute-1 ceph-mon[80754]: pgmap v792: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:52 compute-1 sudo[194670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osacknhmlloswjqwkixoxgthiewmtifj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398331.6530848-4001-233313717097530/AnsiballZ_blockinfile.py'
Nov 29 06:38:52 compute-1 sudo[194670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:52 compute-1 podman[194672]: 2025-11-29 06:38:52.326391987 +0000 UTC m=+0.079916148 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 06:38:52 compute-1 python3.9[194673]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:52 compute-1 sudo[194670]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:53.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:53 compute-1 sudo[194841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehhoqupzdtfflmnmroiclumfphianmhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398332.9653049-4028-73561977592653/AnsiballZ_command.py'
Nov 29 06:38:53 compute-1 sudo[194841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:53 compute-1 python3.9[194843]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:53.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:53 compute-1 sudo[194841]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:53 compute-1 ceph-mon[80754]: pgmap v793: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:54 compute-1 sudo[194994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywvzikdenaxettcsfpixsbdgpnaapmsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398333.758196-4052-224984452369339/AnsiballZ_stat.py'
Nov 29 06:38:54 compute-1 sudo[194994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:54 compute-1 python3.9[194996]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:38:54 compute-1 sudo[194994]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:54 compute-1 sudo[195148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhmhwskcokkftkoxkgmrgfharqzgayfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398334.5364487-4076-213897833243468/AnsiballZ_command.py'
Nov 29 06:38:54 compute-1 sudo[195148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:55.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:55 compute-1 python3.9[195150]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:55 compute-1 sudo[195148]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:55 compute-1 sshd-session[194601]: Invalid user adsl from 119.45.242.7 port 46220
Nov 29 06:38:55 compute-1 ceph-mon[80754]: pgmap v794: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:55 compute-1 sshd-session[194601]: Received disconnect from 119.45.242.7 port 46220:11: Bye Bye [preauth]
Nov 29 06:38:55 compute-1 sshd-session[194601]: Disconnected from invalid user adsl 119.45.242.7 port 46220 [preauth]
Nov 29 06:38:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:55.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:55 compute-1 sudo[195303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyrqvamaqspunaanaljrythchpfwxbpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398335.4297652-4101-139910075700574/AnsiballZ_file.py'
Nov 29 06:38:55 compute-1 sudo[195303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:38:55 compute-1 python3.9[195305]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:55 compute-1 sudo[195303]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:56 compute-1 sudo[195455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyfroxutenbyqscndbklqoejfzrxcusc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398336.1676142-4124-17413112356666/AnsiballZ_stat.py'
Nov 29 06:38:56 compute-1 sudo[195455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:56 compute-1 python3.9[195457]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:56 compute-1 sudo[195455]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:57.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:57 compute-1 sudo[195578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gewzgdtswjbehmjdtsfwowmbyqppsipm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398336.1676142-4124-17413112356666/AnsiballZ_copy.py'
Nov 29 06:38:57 compute-1 sudo[195578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:57 compute-1 python3.9[195580]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398336.1676142-4124-17413112356666/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:57 compute-1 sudo[195578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:57.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:57 compute-1 ceph-mon[80754]: pgmap v795: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:38:58 compute-1 sudo[195730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iecwfhunepjlmcytbolhedbykiyxgnlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398337.9043186-4169-173428630325823/AnsiballZ_stat.py'
Nov 29 06:38:58 compute-1 sudo[195730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:58 compute-1 python3.9[195732]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:58 compute-1 sudo[195730]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:58 compute-1 sudo[195853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eodkjwvdvdpputqcdxkqxqqvndzirala ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398337.9043186-4169-173428630325823/AnsiballZ_copy.py'
Nov 29 06:38:58 compute-1 sudo[195853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:38:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:59.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:38:59 compute-1 python3.9[195855]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398337.9043186-4169-173428630325823/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:59 compute-1 sudo[195853]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:38:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:38:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:59.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:38:59 compute-1 sudo[196005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-houxknyoccgghiaycqjbasuemiamssxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398339.3179662-4214-44976795114830/AnsiballZ_stat.py'
Nov 29 06:38:59 compute-1 sudo[196005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:59 compute-1 python3.9[196007]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:59 compute-1 sudo[196005]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:00 compute-1 sudo[196128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inonqdbiywcocaomtjbqyyguaiayqxzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398339.3179662-4214-44976795114830/AnsiballZ_copy.py'
Nov 29 06:39:00 compute-1 sudo[196128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:00 compute-1 ceph-mon[80754]: pgmap v796: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:00 compute-1 python3.9[196130]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398339.3179662-4214-44976795114830/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:00 compute-1 sudo[196128]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:01.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:01 compute-1 sudo[196280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxfytopakiffrddajlvccthcjeijlvpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398340.970198-4259-208426912892086/AnsiballZ_systemd.py'
Nov 29 06:39:01 compute-1 sudo[196280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:01.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:01 compute-1 python3.9[196282]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:39:01 compute-1 systemd[1]: Reloading.
Nov 29 06:39:01 compute-1 systemd-rc-local-generator[196305]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:01 compute-1 systemd-sysv-generator[196308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:01 compute-1 ceph-mon[80754]: pgmap v797: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:02 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Nov 29 06:39:02 compute-1 sudo[196280]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:02 compute-1 sudo[196470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmzwxqllkbjqbbmunszsaweovrigwipj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398342.3489652-4283-62242973389294/AnsiballZ_systemd.py'
Nov 29 06:39:02 compute-1 sudo[196470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:02 compute-1 python3.9[196472]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 06:39:02 compute-1 systemd[1]: Reloading.
Nov 29 06:39:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:03.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:03 compute-1 systemd-rc-local-generator[196501]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:03 compute-1 systemd-sysv-generator[196505]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:03 compute-1 ceph-mon[80754]: pgmap v798: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:03 compute-1 systemd[1]: Reloading.
Nov 29 06:39:03 compute-1 systemd-rc-local-generator[196536]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:03 compute-1 systemd-sysv-generator[196540]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:03.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:03 compute-1 sudo[196470]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:04 compute-1 sshd-session[139367]: Connection closed by 192.168.122.30 port 45816
Nov 29 06:39:04 compute-1 sshd-session[139364]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:39:04 compute-1 systemd[1]: session-48.scope: Deactivated successfully.
Nov 29 06:39:04 compute-1 systemd[1]: session-48.scope: Consumed 3min 48.008s CPU time.
Nov 29 06:39:04 compute-1 systemd-logind[785]: Session 48 logged out. Waiting for processes to exit.
Nov 29 06:39:04 compute-1 systemd-logind[785]: Removed session 48.
Nov 29 06:39:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:05.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:05.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:06 compute-1 ceph-mon[80754]: pgmap v799: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:07.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:07 compute-1 sudo[196572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:07 compute-1 sudo[196572]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:07 compute-1 sudo[196572]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:07 compute-1 sudo[196597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:39:07 compute-1 sudo[196597]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:07 compute-1 sudo[196597]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:07 compute-1 ceph-mon[80754]: pgmap v800: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:39:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:07.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:39:07 compute-1 sudo[196622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:07 compute-1 sudo[196622]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:07 compute-1 sudo[196622]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:07 compute-1 sudo[196647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Nov 29 06:39:07 compute-1 sudo[196647]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:07 compute-1 sshd-session[196570]: Connection closed by 66.94.122.234 port 41458 [preauth]
Nov 29 06:39:07 compute-1 sudo[196647]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:09.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:09 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:09 compute-1 ceph-mon[80754]: pgmap v801: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:39:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:09.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:39:09 compute-1 sudo[196692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:09 compute-1 sudo[196692]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:09 compute-1 sudo[196692]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:09 compute-1 sudo[196717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:39:09 compute-1 sudo[196717]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:09 compute-1 sudo[196717]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:09 compute-1 sudo[196742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:09 compute-1 sudo[196742]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:09 compute-1 sudo[196742]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:09 compute-1 sudo[196767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:39:09 compute-1 sudo[196767]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:09 compute-1 sshd-session[196792]: Accepted publickey for zuul from 192.168.122.30 port 32948 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:39:09 compute-1 systemd-logind[785]: New session 49 of user zuul.
Nov 29 06:39:10 compute-1 systemd[1]: Started Session 49 of User zuul.
Nov 29 06:39:10 compute-1 sshd-session[196792]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:39:10 compute-1 sudo[196767]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:39:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:11.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:39:11 compute-1 python3.9[196977]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:39:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:39:11 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:39:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:11.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:12 compute-1 ceph-mon[80754]: pgmap v802: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:39:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:39:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:39:12 compute-1 python3.9[197131]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:39:12 compute-1 network[197148]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:39:12 compute-1 network[197149]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:39:12 compute-1 network[197150]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:39:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:13.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:13.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:13 compute-1 ceph-mon[80754]: pgmap v803: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:15.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:15.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:15 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:39:15.906 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:39:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:39:15.909 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:39:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:39:15.909 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:39:15 compute-1 ceph-mon[80754]: pgmap v804: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:17.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:17 compute-1 podman[197318]: 2025-11-29 06:39:17.38269354 +0000 UTC m=+0.112818088 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 06:39:17 compute-1 sudo[197446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eijkfwnokmeitcmgkhwslpltfuufuquq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398357.2419827-107-62304254639350/AnsiballZ_setup.py'
Nov 29 06:39:17 compute-1 sudo[197446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:17.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:17 compute-1 python3.9[197448]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:39:18 compute-1 ceph-mon[80754]: pgmap v805: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:18 compute-1 sudo[197446]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:18 compute-1 sudo[197530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqbbnbzlrrdinbgbvagjeutzmwrupjko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398357.2419827-107-62304254639350/AnsiballZ_dnf.py'
Nov 29 06:39:18 compute-1 sudo[197530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:18 compute-1 python3.9[197532]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:39:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:19.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:19 compute-1 ceph-mon[80754]: pgmap v806: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:19.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:20 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:21.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:21 compute-1 sudo[197534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:39:21 compute-1 sudo[197534]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:21 compute-1 sudo[197534]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:21 compute-1 sudo[197559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:39:21 compute-1 sudo[197559]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:39:21 compute-1 sudo[197559]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:21.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:21 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:39:21 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 2457 writes, 14K keys, 2457 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.03 MB/s
                                           Cumulative WAL: 2457 writes, 2457 syncs, 1.00 writes per sync, written: 0.03 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1119 writes, 4827 keys, 1119 commit groups, 1.0 writes per commit group, ingest: 11.96 MB, 0.02 MB/s
                                           Interval WAL: 1119 writes, 1119 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     45.8      0.37              0.06         5    0.074       0      0       0.0       0.0
                                             L6      1/0   10.12 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.2     76.8     65.0      0.58              0.13         4    0.146     17K   1772       0.0       0.0
                                            Sum      1/0   10.12 MB   0.0      0.0     0.0      0.0       0.1      0.0       0.0   3.2     47.0     57.5      0.96              0.18         9    0.106     17K   1772       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   4.2    105.3    110.1      0.23              0.09         4    0.057    9504   1038       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     76.8     65.0      0.58              0.13         4    0.146     17K   1772       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     64.6      0.26              0.06         4    0.066       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.017, interval 0.006
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.05 GB write, 0.05 MB/s write, 0.04 GB read, 0.04 MB/s read, 1.0 seconds
                                           Interval compaction: 0.02 GB write, 0.04 MB/s write, 0.02 GB read, 0.04 MB/s read, 0.2 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x562155f711f0#2 capacity: 304.00 MB usage: 1.50 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 4.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(66,1.32 MB,0.433641%) FilterBlock(9,58.36 KB,0.0187472%) IndexBlock(9,128.73 KB,0.0413543%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Nov 29 06:39:22 compute-1 ceph-mon[80754]: pgmap v807: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:39:22 compute-1 sshd-session[197584]: userauth_pubkey: signature algorithm ssh-rsa not in PubkeyAcceptedAlgorithms [preauth]
Nov 29 06:39:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:23.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:23 compute-1 podman[197586]: 2025-11-29 06:39:23.311154361 +0000 UTC m=+0.054133629 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 06:39:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:23.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:23 compute-1 ceph-mon[80754]: pgmap v808: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:24 compute-1 sudo[197530]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:24 compute-1 sudo[197754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kazhfatwylhyygdgpuolaxjcjlhfsxfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398364.4739938-143-159373078075267/AnsiballZ_stat.py'
Nov 29 06:39:24 compute-1 sudo[197754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:25.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:25 compute-1 python3.9[197756]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:39:25 compute-1 sudo[197754]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:25.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:25 compute-1 ceph-mon[80754]: pgmap v809: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:25 compute-1 sudo[197906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afkhrmrakmszsliraviuefijomhnezgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398365.4749937-173-66338795703533/AnsiballZ_command.py'
Nov 29 06:39:25 compute-1 sudo[197906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:26 compute-1 python3.9[197908]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:39:26 compute-1 sudo[197906]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:26 compute-1 sudo[198059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfmdswwruyeylfesliauawqdjeqvbebx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398366.5162008-203-144732406466075/AnsiballZ_stat.py'
Nov 29 06:39:26 compute-1 sudo[198059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:26 compute-1 python3.9[198061]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:39:27 compute-1 sudo[198059]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:27.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:27 compute-1 sudo[198211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqpkilxltkaftttijzkhzvawiemqiyjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398367.2159908-227-251173450489141/AnsiballZ_command.py'
Nov 29 06:39:27 compute-1 sudo[198211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:27 compute-1 python3.9[198213]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:39:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:27.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:27 compute-1 sudo[198211]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:27 compute-1 ceph-mon[80754]: pgmap v810: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:28 compute-1 sudo[198364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxaejzyumewjuwzjdjiytdywfkunhlgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398367.9246383-251-37782226970519/AnsiballZ_stat.py'
Nov 29 06:39:28 compute-1 sudo[198364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:28 compute-1 python3.9[198366]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:28 compute-1 sudo[198364]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.072784) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369072853, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1500, "num_deletes": 250, "total_data_size": 3710953, "memory_usage": 3755312, "flush_reason": "Manual Compaction"}
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Nov 29 06:39:29 compute-1 sudo[198487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwuplfzecyhhcxfwlfblkvueogdaeylc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398367.9246383-251-37782226970519/AnsiballZ_copy.py'
Nov 29 06:39:29 compute-1 sudo[198487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:29.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369129935, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1461897, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13025, "largest_seqno": 14520, "table_properties": {"data_size": 1457009, "index_size": 2284, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12208, "raw_average_key_size": 20, "raw_value_size": 1446470, "raw_average_value_size": 2402, "num_data_blocks": 103, "num_entries": 602, "num_filter_entries": 602, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398218, "oldest_key_time": 1764398218, "file_creation_time": 1764398369, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 57227 microseconds, and 5099 cpu microseconds.
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.130004) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1461897 bytes OK
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.130031) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.154395) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.154441) EVENT_LOG_v1 {"time_micros": 1764398369154431, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.154468) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 3704043, prev total WAL file size 3704043, number of live WAL files 2.
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.156436) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323533' seq:72057594037927935, type:22 .. '6D67727374617400353034' seq:0, type:0; will stop at (end)
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1427KB)], [24(10MB)]
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369156556, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 12071875, "oldest_snapshot_seqno": -1}
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4613 keys, 9170434 bytes, temperature: kUnknown
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369240842, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 9170434, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9137316, "index_size": 20464, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11589, "raw_key_size": 112549, "raw_average_key_size": 24, "raw_value_size": 9051697, "raw_average_value_size": 1962, "num_data_blocks": 883, "num_entries": 4613, "num_filter_entries": 4613, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398369, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.241424) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 9170434 bytes
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.243876) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.7 rd, 108.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 10.1 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(14.5) write-amplify(6.3) OK, records in: 5066, records dropped: 453 output_compression: NoCompression
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.243937) EVENT_LOG_v1 {"time_micros": 1764398369243924, "job": 12, "event": "compaction_finished", "compaction_time_micros": 84617, "compaction_time_cpu_micros": 42828, "output_level": 6, "num_output_files": 1, "total_output_size": 9170434, "num_input_records": 5066, "num_output_records": 4613, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369245165, "job": 12, "event": "table_file_deletion", "file_number": 26}
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369248659, "job": 12, "event": "table_file_deletion", "file_number": 24}
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.156219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.248940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.248948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.248961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.248965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.248970) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:39:29 compute-1 python3.9[198489]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398367.9246383-251-37782226970519/.source.iscsi _original_basename=.iv8wzz1q follow=False checksum=9eacfeea91ec496576ff4a4caacc5e836fc91499 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:29 compute-1 sudo[198487]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:29.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:29 compute-1 ceph-mon[80754]: pgmap v811: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:30 compute-1 sudo[198639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdlmfgojzxcedhosbzghhanmalophklr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398369.5049405-296-174468431717539/AnsiballZ_file.py'
Nov 29 06:39:30 compute-1 sudo[198639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:30 compute-1 python3.9[198641]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:30 compute-1 sudo[198639]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:30 compute-1 sudo[198791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooyyzlaqnmxkdzdbttuxupwnvkrvyulp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398370.4242995-320-251133755355557/AnsiballZ_lineinfile.py'
Nov 29 06:39:30 compute-1 sudo[198791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:31.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:31 compute-1 python3.9[198793]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:31 compute-1 sudo[198791]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:31.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:32 compute-1 ceph-mon[80754]: pgmap v812: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:32 compute-1 sudo[198943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccekuwhxkqpykbpxhisjmsegzgrpslqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398371.5249267-347-219610800305566/AnsiballZ_systemd_service.py'
Nov 29 06:39:32 compute-1 sudo[198943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:32 compute-1 sshd-session[197584]: Connection closed by authenticating user root 139.19.117.129 port 44242 [preauth]
Nov 29 06:39:32 compute-1 python3.9[198945]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:39:32 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 29 06:39:32 compute-1 sudo[198943]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:33.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:33 compute-1 ceph-mon[80754]: pgmap v813: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:33 compute-1 sudo[199099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygakesfltglifkybnxvahegjlpzvdpdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398372.8713865-371-173689089493412/AnsiballZ_systemd_service.py'
Nov 29 06:39:33 compute-1 sudo[199099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:33 compute-1 python3.9[199101]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:39:33 compute-1 systemd[1]: Reloading.
Nov 29 06:39:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:33.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:33 compute-1 systemd-sysv-generator[199128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:33 compute-1 systemd-rc-local-generator[199123]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:33 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 06:39:34 compute-1 systemd[1]: Starting Open-iSCSI...
Nov 29 06:39:34 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Nov 29 06:39:34 compute-1 systemd[1]: Started Open-iSCSI.
Nov 29 06:39:34 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 29 06:39:34 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 29 06:39:34 compute-1 sudo[199099]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:35.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:35 compute-1 sudo[199304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgldjlgplwavtpqlsvpvaypzwwiimcia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398375.1200762-404-208059582111797/AnsiballZ_service_facts.py'
Nov 29 06:39:35 compute-1 sudo[199304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:35 compute-1 python3.9[199307]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:39:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:35.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:35 compute-1 network[199325]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:39:35 compute-1 network[199326]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:39:35 compute-1 network[199327]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:39:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:36 compute-1 ceph-mon[80754]: pgmap v814: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:36 compute-1 sshd-session[199306]: Received disconnect from 93.157.248.178 port 47904:11: Bye Bye [preauth]
Nov 29 06:39:36 compute-1 sshd-session[199306]: Disconnected from authenticating user root 93.157.248.178 port 47904 [preauth]
Nov 29 06:39:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:37.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:37 compute-1 ceph-mon[80754]: pgmap v815: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:37.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:39.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:39 compute-1 sudo[199304]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:39.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:39 compute-1 ceph-mon[80754]: pgmap v816: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:40 compute-1 sudo[199597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qipyynmgagcjfcaziuyembrxwuiwmrmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398380.4376173-434-226488866674399/AnsiballZ_file.py'
Nov 29 06:39:40 compute-1 sudo[199597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:41 compute-1 python3.9[199599]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 06:39:41 compute-1 sudo[199597]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:41.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:41.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:41 compute-1 ceph-mon[80754]: pgmap v817: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:43.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:43.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:43 compute-1 ceph-mon[80754]: pgmap v818: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:45 compute-1 sshd-session[199140]: ssh_dispatch_run_fatal: Connection from 119.45.242.7 port 57008: Connection timed out [preauth]
Nov 29 06:39:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:45.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:45 compute-1 sudo[199749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sehazigcxoohmqofarvcoebnlhbjsyij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398384.6483183-458-125741359840043/AnsiballZ_modprobe.py'
Nov 29 06:39:45 compute-1 sudo[199749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:45.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:45 compute-1 python3.9[199751]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 29 06:39:45 compute-1 sudo[199749]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:45 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:39:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:45 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:39:46 compute-1 ceph-mon[80754]: pgmap v819: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:46 compute-1 sudo[199906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iinqjkzceepofvjldeqneynvcjnykskx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398386.3816695-482-211260985094385/AnsiballZ_stat.py'
Nov 29 06:39:46 compute-1 sudo[199906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:47 compute-1 python3.9[199908]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:47 compute-1 sudo[199906]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:47.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:47 compute-1 ceph-mon[80754]: pgmap v820: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:47 compute-1 sudo[200040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbbouvfiipivcsyhxispkexewyllrbxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398386.3816695-482-211260985094385/AnsiballZ_copy.py'
Nov 29 06:39:47 compute-1 sudo[200040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:47 compute-1 podman[200003]: 2025-11-29 06:39:47.649975526 +0000 UTC m=+0.117865423 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller)
Nov 29 06:39:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:47.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:47 compute-1 python3.9[200051]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398386.3816695-482-211260985094385/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:47 compute-1 sudo[200040]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:49.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:49 compute-1 ceph-mon[80754]: pgmap v821: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:39:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:49.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:39:49 compute-1 sudo[200208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shtklauaodsqxjyhandjgrvkypbiyutg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398389.381981-530-274481123280857/AnsiballZ_lineinfile.py'
Nov 29 06:39:49 compute-1 sudo[200208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:50 compute-1 python3.9[200210]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:50 compute-1 sudo[200208]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:51.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:51 compute-1 ceph-mon[80754]: pgmap v822: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:51.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:53.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:53 compute-1 ceph-mon[80754]: pgmap v823: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:53.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:54 compute-1 podman[200287]: 2025-11-29 06:39:54.327724221 +0000 UTC m=+0.066172369 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:39:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:39:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:55.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:39:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:55.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:39:56 compute-1 ceph-mon[80754]: pgmap v824: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:39:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:57.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:39:57 compute-1 sudo[200379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efcoisymljxrmujdsjksdrfyvfxorpun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398390.4110453-554-131527630622126/AnsiballZ_systemd.py'
Nov 29 06:39:57 compute-1 sudo[200379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:57 compute-1 ceph-mon[80754]: pgmap v825: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:39:57 compute-1 python3.9[200381]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:39:57 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 06:39:57 compute-1 systemd[1]: Stopped Load Kernel Modules.
Nov 29 06:39:57 compute-1 systemd[1]: Stopping Load Kernel Modules...
Nov 29 06:39:57 compute-1 systemd[1]: Starting Load Kernel Modules...
Nov 29 06:39:57 compute-1 systemd[1]: Finished Load Kernel Modules.
Nov 29 06:39:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:39:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:57.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:39:57 compute-1 sudo[200379]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:58 compute-1 sudo[200535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gijnxtdlbjluoqsayldmvgnvrjfmwgbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398398.4221408-578-164377460457211/AnsiballZ_file.py'
Nov 29 06:39:58 compute-1 sudo[200535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:58 compute-1 python3.9[200537]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:39:58 compute-1 sudo[200535]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:39:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:59.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:39:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:39:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:39:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:59.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:00 compute-1 ceph-mon[80754]: pgmap v826: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:00 compute-1 sudo[200687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucfcvoqtgonyaqigwyaavedmsxyjfczj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398399.8898315-605-233287468237418/AnsiballZ_stat.py'
Nov 29 06:40:00 compute-1 sudo[200687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:00 compute-1 python3.9[200689]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:00 compute-1 sudo[200687]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:01 compute-1 sudo[200839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nktrmzvrujkmdjkhrvckgjvxbwfmeatr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398400.7772145-632-21787421709172/AnsiballZ_stat.py'
Nov 29 06:40:01 compute-1 sudo[200839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:01.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:01 compute-1 ceph-mon[80754]: overall HEALTH_OK
Nov 29 06:40:01 compute-1 ceph-mon[80754]: pgmap v827: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:01 compute-1 python3.9[200841]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:01 compute-1 sudo[200839]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:01.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:02 compute-1 sudo[200991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjgqwspajngemjeyqsvjbvfwzyrmeljk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398401.512431-656-184756769076745/AnsiballZ_stat.py'
Nov 29 06:40:02 compute-1 sudo[200991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:02 compute-1 python3.9[200993]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:02 compute-1 sudo[200991]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:03.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:03 compute-1 sudo[201114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxsybklzgutgdgbvjuxfwigsnyfvgqgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398401.512431-656-184756769076745/AnsiballZ_copy.py'
Nov 29 06:40:03 compute-1 sudo[201114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:03 compute-1 python3.9[201116]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398401.512431-656-184756769076745/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:03 compute-1 sudo[201114]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:03.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:03 compute-1 ceph-mon[80754]: pgmap v828: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:04 compute-1 sudo[201266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ierarwayodjrsfxvxktmtmlbatchsqnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398403.8561947-702-138354120883593/AnsiballZ_command.py'
Nov 29 06:40:04 compute-1 sudo[201266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:04 compute-1 python3.9[201268]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:04 compute-1 sudo[201266]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:05 compute-1 sudo[201419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feuhdrvjjnfhiwjsdiwcmsusdipzzydg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398404.7252402-725-196552972000355/AnsiballZ_lineinfile.py'
Nov 29 06:40:05 compute-1 sudo[201419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:05.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:05 compute-1 python3.9[201421]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:05 compute-1 sudo[201419]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:05.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:06 compute-1 sudo[201571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dklrtampfxaywaewymnkgoljuxqdnsdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398405.4766865-749-217914890386665/AnsiballZ_replace.py'
Nov 29 06:40:06 compute-1 sudo[201571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:06 compute-1 python3.9[201573]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:06 compute-1 sudo[201571]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:06 compute-1 ceph-mon[80754]: pgmap v829: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:07.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:07 compute-1 sudo[201723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgobmzrutpblndnwmtlkklfjclyqvdmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398407.026109-773-90248212324386/AnsiballZ_replace.py'
Nov 29 06:40:07 compute-1 sudo[201723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:07 compute-1 python3.9[201725]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:07 compute-1 sudo[201723]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:07.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:08 compute-1 sudo[201875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stjpijksxwfrjrdyuoxgfdntwycrwckd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398407.959259-800-53666320251387/AnsiballZ_lineinfile.py'
Nov 29 06:40:08 compute-1 sudo[201875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:08 compute-1 python3.9[201877]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:08 compute-1 sudo[201875]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:08 compute-1 ceph-mon[80754]: pgmap v830: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:09 compute-1 sudo[202027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdelirzlhujoktiyrfnyosqyabzwrkyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398408.6957517-800-56734859122392/AnsiballZ_lineinfile.py'
Nov 29 06:40:09 compute-1 sudo[202027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:09.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:09 compute-1 python3.9[202029]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:09 compute-1 sudo[202027]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:09.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:10 compute-1 sudo[202179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohpnojnfytmgznvlwewvhhhgrvpphybw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398409.6008456-800-189987445649251/AnsiballZ_lineinfile.py'
Nov 29 06:40:10 compute-1 sudo[202179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:10 compute-1 ceph-mon[80754]: pgmap v831: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:10 compute-1 python3.9[202181]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:10 compute-1 sudo[202179]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:11 compute-1 sudo[202331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujhbcvcdvhiugdsofbembthkoesybkue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398410.617121-800-78065261541659/AnsiballZ_lineinfile.py'
Nov 29 06:40:11 compute-1 sudo[202331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:11.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:11 compute-1 python3.9[202333]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:11 compute-1 sudo[202331]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:11 compute-1 ceph-mon[80754]: pgmap v832: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:11.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:11 compute-1 sudo[202483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-turqaedqycnkeamumcxrbdndcpgcaidw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398411.5802152-887-177974865443742/AnsiballZ_stat.py'
Nov 29 06:40:11 compute-1 sudo[202483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:12 compute-1 python3.9[202485]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:12 compute-1 sudo[202483]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:12 compute-1 sudo[202637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmzdfwwgqhqbprmtfdllqjttdyshcvxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398412.369221-911-154561975131692/AnsiballZ_file.py'
Nov 29 06:40:12 compute-1 sudo[202637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:12 compute-1 python3.9[202639]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:12 compute-1 sudo[202637]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:13.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:13 compute-1 ceph-mon[80754]: pgmap v833: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:13.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:13 compute-1 sudo[202789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcfppsvqlhestahuqdmmrgtvddralzdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398413.512893-938-247722062596957/AnsiballZ_file.py'
Nov 29 06:40:13 compute-1 sudo[202789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:14 compute-1 python3.9[202791]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:14 compute-1 sudo[202789]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:14 compute-1 sudo[202941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrkrolmarhievdcddksbysjjzfudvplb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398414.416176-962-134599697057430/AnsiballZ_stat.py'
Nov 29 06:40:14 compute-1 sudo[202941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:14 compute-1 python3.9[202943]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:14 compute-1 sudo[202941]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:15.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:15 compute-1 sudo[203019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emnchcfynzvpwgdvxewdvzuuwolpidxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398414.416176-962-134599697057430/AnsiballZ_file.py'
Nov 29 06:40:15 compute-1 sudo[203019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:15 compute-1 python3.9[203021]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:15 compute-1 sudo[203019]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:15.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:15 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:40:15.907 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:40:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:40:15.909 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:40:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:40:15.909 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:40:15 compute-1 sudo[203171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulsmoihxwworgotthvmskooyhgqdsvnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398415.7091002-962-265368172280086/AnsiballZ_stat.py'
Nov 29 06:40:15 compute-1 sudo[203171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:16 compute-1 python3.9[203173]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:16 compute-1 sudo[203171]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:16 compute-1 ceph-mon[80754]: pgmap v834: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:16 compute-1 sudo[203249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mazdtxbzorjgskmowrvxabxahsgjlrgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398415.7091002-962-265368172280086/AnsiballZ_file.py'
Nov 29 06:40:16 compute-1 sudo[203249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:16 compute-1 python3.9[203251]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:16 compute-1 sudo[203249]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:17.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:17 compute-1 sudo[203401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrxymuqblkasthfnbsfppegwbgafkudj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398417.10552-1033-167173101105923/AnsiballZ_file.py'
Nov 29 06:40:17 compute-1 sudo[203401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:17 compute-1 python3.9[203403]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:17 compute-1 sudo[203401]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:17.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:17 compute-1 ceph-mon[80754]: pgmap v835: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:18 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 29 06:40:18 compute-1 podman[203478]: 2025-11-29 06:40:18.149962434 +0000 UTC m=+0.089294362 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:40:18 compute-1 sudo[203581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytjvrewemrivimgscdjytoqcgdnnwduc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398417.9558887-1055-179419266679848/AnsiballZ_stat.py'
Nov 29 06:40:18 compute-1 sudo[203581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:18 compute-1 python3.9[203583]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:18 compute-1 sudo[203581]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:18 compute-1 sudo[203659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfeazvuhfhjypvjdqdpmczzpkydpuibt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398417.9558887-1055-179419266679848/AnsiballZ_file.py'
Nov 29 06:40:18 compute-1 sudo[203659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:19 compute-1 python3.9[203661]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:19 compute-1 sudo[203659]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:19.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:19 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 06:40:19 compute-1 ceph-mon[80754]: pgmap v836: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:19 compute-1 sudo[203812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snzfykwwofnmpvcmbfalhaqphkegmgrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398419.2849472-1091-193843493028364/AnsiballZ_stat.py'
Nov 29 06:40:19 compute-1 sudo[203812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:19 compute-1 python3.9[203814]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:19 compute-1 sudo[203812]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:19.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:20 compute-1 sudo[203890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwiepogbykgxqdfmeouqythuuwvwmwym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398419.2849472-1091-193843493028364/AnsiballZ_file.py'
Nov 29 06:40:20 compute-1 sudo[203890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:20 compute-1 python3.9[203892]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:20 compute-1 sudo[203890]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:20 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:20 compute-1 sudo[204042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikbmeyatrmuksmanhcfdvtynrtufuvim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398420.5835257-1127-131844316017466/AnsiballZ_systemd.py'
Nov 29 06:40:20 compute-1 sudo[204042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:21.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:21 compute-1 python3.9[204044]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:21 compute-1 systemd[1]: Reloading.
Nov 29 06:40:21 compute-1 systemd-rc-local-generator[204070]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:40:21 compute-1 systemd-sysv-generator[204073]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:40:21 compute-1 sudo[204080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:21 compute-1 ceph-mon[80754]: pgmap v837: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:21 compute-1 sudo[204080]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:21 compute-1 sudo[204080]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:21 compute-1 sudo[204042]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:21 compute-1 sudo[204107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:40:21 compute-1 sudo[204107]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:21 compute-1 sudo[204107]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:21 compute-1 sudo[204132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:21 compute-1 sudo[204132]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:21 compute-1 sudo[204132]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:21 compute-1 sudo[204181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:40:21 compute-1 sudo[204181]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:21.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:22 compute-1 sudo[204181]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:22 compute-1 sudo[204363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alehdmmosjvuimegbnyzeirofbwhzpnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398422.121314-1151-132428763523731/AnsiballZ_stat.py'
Nov 29 06:40:22 compute-1 sudo[204363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:22 compute-1 python3.9[204365]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:22 compute-1 sudo[204363]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:23.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:23 compute-1 sudo[204441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhqkcixxiujncrzniifdbkjjkbkgkwfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398422.121314-1151-132428763523731/AnsiballZ_file.py'
Nov 29 06:40:23 compute-1 sudo[204441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:23 compute-1 python3.9[204443]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:40:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:40:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 06:40:23 compute-1 ceph-mon[80754]: pgmap v838: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 06:40:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:40:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:40:23 compute-1 sudo[204441]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:23.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:24 compute-1 sudo[204593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiysjvwhiyxxtqqzrqveulkdawdklcbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398423.7396142-1187-112732342809121/AnsiballZ_stat.py'
Nov 29 06:40:24 compute-1 sudo[204593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:24 compute-1 python3.9[204595]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:24 compute-1 sudo[204593]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.343467) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424343498, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 734, "num_deletes": 252, "total_data_size": 1398800, "memory_usage": 1425240, "flush_reason": "Manual Compaction"}
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424352568, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 924399, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14525, "largest_seqno": 15254, "table_properties": {"data_size": 920862, "index_size": 1381, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 6914, "raw_average_key_size": 16, "raw_value_size": 913841, "raw_average_value_size": 2191, "num_data_blocks": 63, "num_entries": 417, "num_filter_entries": 417, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398369, "oldest_key_time": 1764398369, "file_creation_time": 1764398424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 9149 microseconds, and 3043 cpu microseconds.
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.352616) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 924399 bytes OK
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.352635) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.353904) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.353917) EVENT_LOG_v1 {"time_micros": 1764398424353914, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.353932) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1394898, prev total WAL file size 1394898, number of live WAL files 2.
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.354612) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323533' seq:0, type:0; will stop at (end)
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(902KB)], [27(8955KB)]
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424354645, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 10094833, "oldest_snapshot_seqno": -1}
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4513 keys, 9525855 bytes, temperature: kUnknown
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424420959, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 9525855, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9492970, "index_size": 20487, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11333, "raw_key_size": 112200, "raw_average_key_size": 24, "raw_value_size": 9408606, "raw_average_value_size": 2084, "num_data_blocks": 864, "num_entries": 4513, "num_filter_entries": 4513, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.421263) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 9525855 bytes
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.423057) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.8 rd, 143.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 8.7 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(21.2) write-amplify(10.3) OK, records in: 5030, records dropped: 517 output_compression: NoCompression
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.423076) EVENT_LOG_v1 {"time_micros": 1764398424423068, "job": 14, "event": "compaction_finished", "compaction_time_micros": 66491, "compaction_time_cpu_micros": 18021, "output_level": 6, "num_output_files": 1, "total_output_size": 9525855, "num_input_records": 5030, "num_output_records": 4513, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424423609, "job": 14, "event": "table_file_deletion", "file_number": 29}
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424424997, "job": 14, "event": "table_file_deletion", "file_number": 27}
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.354502) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.425152) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.425158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.425159) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.425162) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.425164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:40:24 compute-1 sudo[204682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxchukmxvhtpsxkbnfzcaerrhmbxgxif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398423.7396142-1187-112732342809121/AnsiballZ_file.py'
Nov 29 06:40:24 compute-1 sudo[204682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:24 compute-1 podman[204646]: 2025-11-29 06:40:24.711274945 +0000 UTC m=+0.069641074 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 06:40:24 compute-1 python3.9[204691]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:24 compute-1 sudo[204682]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:25.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:40:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:40:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:40:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:40:25 compute-1 sudo[204844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iooagjkidgjlovuvylyteddfgkpkahzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398425.3783169-1224-30081345758873/AnsiballZ_systemd.py'
Nov 29 06:40:25 compute-1 sudo[204844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:25.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:26 compute-1 python3.9[204846]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:26 compute-1 systemd[1]: Reloading.
Nov 29 06:40:26 compute-1 systemd-rc-local-generator[204871]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:40:26 compute-1 systemd-sysv-generator[204874]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:40:26 compute-1 systemd[1]: Starting Create netns directory...
Nov 29 06:40:26 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:40:26 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:40:26 compute-1 systemd[1]: Finished Create netns directory.
Nov 29 06:40:26 compute-1 sudo[204844]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:26 compute-1 ceph-mon[80754]: pgmap v839: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:27.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:27 compute-1 sudo[205037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vweeybxvegunszrbpctcvuwavhoaxaxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398427.1878338-1253-4617814330079/AnsiballZ_file.py'
Nov 29 06:40:27 compute-1 sudo[205037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:27 compute-1 python3.9[205039]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:27 compute-1 sudo[205037]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:40:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:27.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:40:28 compute-1 ceph-mon[80754]: pgmap v840: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:28 compute-1 sudo[205189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnsdpycesdzarhxdhfugsefabupliyip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398428.0534494-1277-225374224493949/AnsiballZ_stat.py'
Nov 29 06:40:28 compute-1 sudo[205189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:28 compute-1 python3.9[205191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:28 compute-1 sudo[205189]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:29 compute-1 sudo[205312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwegtvtwftlxjpxrmqgiaswblmgcmpys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398428.0534494-1277-225374224493949/AnsiballZ_copy.py'
Nov 29 06:40:29 compute-1 sudo[205312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:29.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:29 compute-1 python3.9[205314]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398428.0534494-1277-225374224493949/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:29 compute-1 sudo[205312]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:29.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:30 compute-1 ceph-mon[80754]: pgmap v841: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:30 compute-1 sudo[205464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhqhwcogulbfdvkngeghmwxmsjqfozpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398430.3175578-1328-147523282297453/AnsiballZ_file.py'
Nov 29 06:40:30 compute-1 sudo[205464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:30 compute-1 python3.9[205466]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:30 compute-1 sudo[205464]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:31.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:31 compute-1 sudo[205616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxrliropfsatmhuganpvydhvhdnhqlzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398431.1533241-1352-221477454993418/AnsiballZ_stat.py'
Nov 29 06:40:31 compute-1 sudo[205616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:31 compute-1 python3.9[205618]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:31 compute-1 sudo[205616]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:31.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:31 compute-1 ceph-mon[80754]: pgmap v842: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:32 compute-1 sudo[205739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epuapoaztcwdawxuawdziwefhrdwdgvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398431.1533241-1352-221477454993418/AnsiballZ_copy.py'
Nov 29 06:40:32 compute-1 sudo[205739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:32 compute-1 python3.9[205741]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398431.1533241-1352-221477454993418/.source.json _original_basename=.ey95y30w follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:32 compute-1 sudo[205739]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:32 compute-1 sudo[205891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwyxeqqfvxkleoesmlertjzjyunzgwcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398432.5327055-1397-175553650819060/AnsiballZ_file.py'
Nov 29 06:40:32 compute-1 sudo[205891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:33 compute-1 python3.9[205893]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:33 compute-1 sudo[205891]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:33.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:33 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 29 06:40:33 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 29 06:40:33 compute-1 ceph-mon[80754]: pgmap v843: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:33 compute-1 sudo[206045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieoeubhiupdcrmraeybacuzelstxinyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398433.4030037-1421-211534030345512/AnsiballZ_stat.py'
Nov 29 06:40:33 compute-1 sudo[206045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:33.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:33 compute-1 sudo[206045]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:34 compute-1 sudo[206168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsohopbipffgsisrgrthjxevhnmqejox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398433.4030037-1421-211534030345512/AnsiballZ_copy.py'
Nov 29 06:40:34 compute-1 sudo[206168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:34 compute-1 sudo[206168]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:34 compute-1 sshd-session[204621]: error: kex_exchange_identification: read: Connection timed out
Nov 29 06:40:34 compute-1 sshd-session[204621]: banner exchange: Connection from 119.45.242.7 port 39558: Connection timed out
Nov 29 06:40:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:35.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:35 compute-1 ceph-mon[80754]: pgmap v844: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:35.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:36 compute-1 sudo[206247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:40:36 compute-1 sudo[206247]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:36 compute-1 sudo[206247]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:36 compute-1 sudo[206272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:40:36 compute-1 sudo[206272]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:40:36 compute-1 sudo[206272]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:37 compute-1 sudo[206370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elpuvapcjikriifiwpcplzvhaeqanfah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398435.658194-1472-58110249857486/AnsiballZ_container_config_data.py'
Nov 29 06:40:37 compute-1 sudo[206370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:37 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:40:37 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:40:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:37.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:37 compute-1 python3.9[206372]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 29 06:40:37 compute-1 sudo[206370]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:37.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:38 compute-1 ceph-mon[80754]: pgmap v845: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:38 compute-1 sudo[206522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mghkyvhbsxbsovallazpwuivydvpdaii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398437.7269936-1499-246093951330968/AnsiballZ_container_config_hash.py'
Nov 29 06:40:38 compute-1 sudo[206522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:38 compute-1 python3.9[206524]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:40:38 compute-1 sudo[206522]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:39.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:39 compute-1 ceph-mon[80754]: pgmap v846: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:39 compute-1 sudo[206674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zicobrsnbaqqikmosylhmyxfeparhjxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398438.925049-1526-72591237464978/AnsiballZ_podman_container_info.py'
Nov 29 06:40:39 compute-1 sudo[206674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:39 compute-1 python3.9[206676]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 06:40:39 compute-1 sudo[206674]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:39.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:41 compute-1 sshd-session[206728]: Invalid user myuser from 93.157.248.178 port 57088
Nov 29 06:40:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:41.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:41 compute-1 sshd-session[206728]: Received disconnect from 93.157.248.178 port 57088:11: Bye Bye [preauth]
Nov 29 06:40:41 compute-1 sshd-session[206728]: Disconnected from invalid user myuser 93.157.248.178 port 57088 [preauth]
Nov 29 06:40:41 compute-1 ceph-mon[80754]: pgmap v847: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:41 compute-1 sudo[206855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uujhcekekakzwxtcrvbtriznhvhyxrcr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398441.0864644-1565-155789449336721/AnsiballZ_edpm_container_manage.py'
Nov 29 06:40:41 compute-1 sudo[206855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:41.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:42 compute-1 python3[206858]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:40:42 compute-1 sshd-session[206857]: Invalid user ali from 71.70.164.48 port 42819
Nov 29 06:40:42 compute-1 sshd-session[206857]: Received disconnect from 71.70.164.48 port 42819:11: Bye Bye [preauth]
Nov 29 06:40:42 compute-1 sshd-session[206857]: Disconnected from invalid user ali 71.70.164.48 port 42819 [preauth]
Nov 29 06:40:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:43.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:43 compute-1 podman[206872]: 2025-11-29 06:40:43.280639091 +0000 UTC m=+1.134216846 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 06:40:43 compute-1 podman[206929]: 2025-11-29 06:40:43.464548999 +0000 UTC m=+0.065207751 container create 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 06:40:43 compute-1 podman[206929]: 2025-11-29 06:40:43.438142566 +0000 UTC m=+0.038801338 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 06:40:43 compute-1 python3[206858]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 06:40:43 compute-1 sudo[206855]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:43.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:44 compute-1 ceph-mon[80754]: pgmap v848: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:45.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:45 compute-1 ceph-mon[80754]: pgmap v849: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:45.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:46 compute-1 sudo[207118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eluszrfiaoahrazdwomahybdrjystplb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398446.125456-1589-66194723721885/AnsiballZ_stat.py'
Nov 29 06:40:46 compute-1 sudo[207118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:46 compute-1 python3.9[207120]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:46 compute-1 sudo[207118]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:47.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:47 compute-1 sudo[207272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkvadatrejjsyoxmnehmpdyywdkztqfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398447.0343926-1616-205547162250166/AnsiballZ_file.py'
Nov 29 06:40:47 compute-1 sudo[207272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:47 compute-1 python3.9[207274]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:47 compute-1 sudo[207272]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:47 compute-1 sudo[207348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdyvfingijqrgugwozqayskcrolefohh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398447.0343926-1616-205547162250166/AnsiballZ_stat.py'
Nov 29 06:40:47 compute-1 sudo[207348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:47.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:47 compute-1 ceph-mon[80754]: pgmap v850: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:47 compute-1 python3.9[207350]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:47 compute-1 sudo[207348]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:48 compute-1 podman[207403]: 2025-11-29 06:40:48.3801898 +0000 UTC m=+0.119409258 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 06:40:48 compute-1 sudo[207526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhfyimubolaxgpvigkxjgizbyarduytm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398448.067343-1616-179385087069921/AnsiballZ_copy.py'
Nov 29 06:40:48 compute-1 sudo[207526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:48 compute-1 python3.9[207528]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398448.067343-1616-179385087069921/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:48 compute-1 sudo[207526]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:49 compute-1 sudo[207602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aepjiuckqyvkefjjrxekscrcelqvsxje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398448.067343-1616-179385087069921/AnsiballZ_systemd.py'
Nov 29 06:40:49 compute-1 sudo[207602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:49.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:49 compute-1 python3.9[207604]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:40:49 compute-1 systemd[1]: Reloading.
Nov 29 06:40:49 compute-1 ceph-mon[80754]: pgmap v851: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:49 compute-1 systemd-rc-local-generator[207628]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:40:49 compute-1 systemd-sysv-generator[207635]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:40:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:49.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:49 compute-1 sudo[207602]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:50 compute-1 sudo[207715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shvhhzbeczaureuzgnechofnhdqsfjgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398448.067343-1616-179385087069921/AnsiballZ_systemd.py'
Nov 29 06:40:50 compute-1 sudo[207715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:50 compute-1 python3.9[207717]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:50 compute-1 systemd[1]: Reloading.
Nov 29 06:40:50 compute-1 systemd-sysv-generator[207753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:40:50 compute-1 systemd-rc-local-generator[207749]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:40:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:50 compute-1 systemd[1]: Starting multipathd container...
Nov 29 06:40:51 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:40:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03113e0c60b3891b666198972292ad578a22a07280e6f874c53ef5a9916475af/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:40:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03113e0c60b3891b666198972292ad578a22a07280e6f874c53ef5a9916475af/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:40:51 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834.
Nov 29 06:40:51 compute-1 podman[207758]: 2025-11-29 06:40:51.175233906 +0000 UTC m=+0.175082274 container init 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:40:51 compute-1 multipathd[207773]: + sudo -E kolla_set_configs
Nov 29 06:40:51 compute-1 podman[207758]: 2025-11-29 06:40:51.207461422 +0000 UTC m=+0.207309740 container start 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 06:40:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:51.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:51 compute-1 podman[207758]: multipathd
Nov 29 06:40:51 compute-1 systemd[1]: Started multipathd container.
Nov 29 06:40:51 compute-1 sudo[207780]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 29 06:40:51 compute-1 sudo[207780]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:40:51 compute-1 sudo[207780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:40:51 compute-1 sudo[207715]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:51 compute-1 multipathd[207773]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:40:51 compute-1 multipathd[207773]: INFO:__main__:Validating config file
Nov 29 06:40:51 compute-1 multipathd[207773]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:40:51 compute-1 multipathd[207773]: INFO:__main__:Writing out command to execute
Nov 29 06:40:51 compute-1 sudo[207780]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:51 compute-1 multipathd[207773]: ++ cat /run_command
Nov 29 06:40:51 compute-1 multipathd[207773]: + CMD='/usr/sbin/multipathd -d'
Nov 29 06:40:51 compute-1 multipathd[207773]: + ARGS=
Nov 29 06:40:51 compute-1 multipathd[207773]: + sudo kolla_copy_cacerts
Nov 29 06:40:51 compute-1 podman[207779]: 2025-11-29 06:40:51.307491431 +0000 UTC m=+0.078067420 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd)
Nov 29 06:40:51 compute-1 systemd[1]: 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-50362cb9e60ec73a.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:40:51 compute-1 systemd[1]: 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-50362cb9e60ec73a.service: Failed with result 'exit-code'.
Nov 29 06:40:51 compute-1 sudo[207808]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 29 06:40:51 compute-1 sudo[207808]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:40:51 compute-1 sudo[207808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:40:51 compute-1 sudo[207808]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:51 compute-1 multipathd[207773]: + [[ ! -n '' ]]
Nov 29 06:40:51 compute-1 multipathd[207773]: + . kolla_extend_start
Nov 29 06:40:51 compute-1 multipathd[207773]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 06:40:51 compute-1 multipathd[207773]: Running command: '/usr/sbin/multipathd -d'
Nov 29 06:40:51 compute-1 multipathd[207773]: + umask 0022
Nov 29 06:40:51 compute-1 multipathd[207773]: + exec /usr/sbin/multipathd -d
Nov 29 06:40:51 compute-1 ceph-mon[80754]: pgmap v852: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:51 compute-1 multipathd[207773]: 3915.815641 | --------start up--------
Nov 29 06:40:51 compute-1 multipathd[207773]: 3915.815663 | read /etc/multipath.conf
Nov 29 06:40:51 compute-1 multipathd[207773]: 3915.824181 | path checkers start up
Nov 29 06:40:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:51.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:40:52 compute-1 sshd-session[207641]: Invalid user gitlab from 66.94.122.234 port 60092
Nov 29 06:40:52 compute-1 sshd-session[207641]: Received disconnect from 66.94.122.234 port 60092:11: Bye Bye [preauth]
Nov 29 06:40:52 compute-1 sshd-session[207641]: Disconnected from invalid user gitlab 66.94.122.234 port 60092 [preauth]
Nov 29 06:40:52 compute-1 python3.9[207963]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:40:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:53.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:53 compute-1 sudo[208115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdnonphuulnjqozkpqkxbvofgqxqwbkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398453.2623184-1724-170253942993609/AnsiballZ_command.py'
Nov 29 06:40:53 compute-1 sudo[208115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:53 compute-1 python3.9[208117]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:53.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:53 compute-1 sudo[208115]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:54 compute-1 ceph-mon[80754]: pgmap v853: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:54 compute-1 sudo[208280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whzqebyxrzipydzxjmvsdoqntbacqgar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398454.2112694-1748-9511502876755/AnsiballZ_systemd.py'
Nov 29 06:40:54 compute-1 sudo[208280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:54 compute-1 python3.9[208282]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:40:54 compute-1 systemd[1]: Stopping multipathd container...
Nov 29 06:40:55 compute-1 podman[208284]: 2025-11-29 06:40:55.028055115 +0000 UTC m=+0.072452953 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:40:55 compute-1 multipathd[207773]: 3919.524102 | exit (signal)
Nov 29 06:40:55 compute-1 multipathd[207773]: 3919.524183 | --------shut down-------
Nov 29 06:40:55 compute-1 systemd[1]: libpod-28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834.scope: Deactivated successfully.
Nov 29 06:40:55 compute-1 podman[208293]: 2025-11-29 06:40:55.084501083 +0000 UTC m=+0.085399933 container died 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 29 06:40:55 compute-1 systemd[1]: 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-50362cb9e60ec73a.timer: Deactivated successfully.
Nov 29 06:40:55 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834.
Nov 29 06:40:55 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-userdata-shm.mount: Deactivated successfully.
Nov 29 06:40:55 compute-1 systemd[1]: var-lib-containers-storage-overlay-03113e0c60b3891b666198972292ad578a22a07280e6f874c53ef5a9916475af-merged.mount: Deactivated successfully.
Nov 29 06:40:55 compute-1 podman[208293]: 2025-11-29 06:40:55.140663113 +0000 UTC m=+0.141561903 container cleanup 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 29 06:40:55 compute-1 podman[208293]: multipathd
Nov 29 06:40:55 compute-1 podman[208337]: multipathd
Nov 29 06:40:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:55.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:55 compute-1 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 29 06:40:55 compute-1 systemd[1]: Stopped multipathd container.
Nov 29 06:40:55 compute-1 systemd[1]: Starting multipathd container...
Nov 29 06:40:55 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:40:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03113e0c60b3891b666198972292ad578a22a07280e6f874c53ef5a9916475af/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:40:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03113e0c60b3891b666198972292ad578a22a07280e6f874c53ef5a9916475af/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:40:55 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834.
Nov 29 06:40:55 compute-1 podman[208349]: 2025-11-29 06:40:55.371728041 +0000 UTC m=+0.125596599 container init 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:40:55 compute-1 multipathd[208365]: + sudo -E kolla_set_configs
Nov 29 06:40:55 compute-1 podman[208349]: 2025-11-29 06:40:55.40190367 +0000 UTC m=+0.155772198 container start 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:40:55 compute-1 sudo[208371]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 29 06:40:55 compute-1 sudo[208371]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:40:55 compute-1 sudo[208371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:40:55 compute-1 podman[208349]: multipathd
Nov 29 06:40:55 compute-1 systemd[1]: Started multipathd container.
Nov 29 06:40:55 compute-1 multipathd[208365]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:40:55 compute-1 multipathd[208365]: INFO:__main__:Validating config file
Nov 29 06:40:55 compute-1 multipathd[208365]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:40:55 compute-1 multipathd[208365]: INFO:__main__:Writing out command to execute
Nov 29 06:40:55 compute-1 sudo[208280]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:55 compute-1 sudo[208371]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:55 compute-1 multipathd[208365]: ++ cat /run_command
Nov 29 06:40:55 compute-1 multipathd[208365]: + CMD='/usr/sbin/multipathd -d'
Nov 29 06:40:55 compute-1 multipathd[208365]: + ARGS=
Nov 29 06:40:55 compute-1 multipathd[208365]: + sudo kolla_copy_cacerts
Nov 29 06:40:55 compute-1 sudo[208390]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 29 06:40:55 compute-1 sudo[208390]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:40:55 compute-1 sudo[208390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:40:55 compute-1 sudo[208390]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:55 compute-1 multipathd[208365]: + [[ ! -n '' ]]
Nov 29 06:40:55 compute-1 multipathd[208365]: + . kolla_extend_start
Nov 29 06:40:55 compute-1 multipathd[208365]: Running command: '/usr/sbin/multipathd -d'
Nov 29 06:40:55 compute-1 multipathd[208365]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 06:40:55 compute-1 multipathd[208365]: + umask 0022
Nov 29 06:40:55 compute-1 multipathd[208365]: + exec /usr/sbin/multipathd -d
Nov 29 06:40:55 compute-1 podman[208372]: 2025-11-29 06:40:55.484558005 +0000 UTC m=+0.068474233 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:40:55 compute-1 systemd[1]: 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-3a5a6556aad8b51f.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:40:55 compute-1 systemd[1]: 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-3a5a6556aad8b51f.service: Failed with result 'exit-code'.
Nov 29 06:40:55 compute-1 multipathd[208365]: 3919.974870 | --------start up--------
Nov 29 06:40:55 compute-1 multipathd[208365]: 3919.974902 | read /etc/multipath.conf
Nov 29 06:40:55 compute-1 multipathd[208365]: 3919.982838 | path checkers start up
Nov 29 06:40:55 compute-1 ceph-mon[80754]: pgmap v854: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:40:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:55.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:57.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:57.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:58 compute-1 ceph-mon[80754]: pgmap v855: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:40:58 compute-1 sudo[208552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdfmctixctbmzkgxxsektljzkvhphdlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398458.1815343-1773-169768832226035/AnsiballZ_file.py'
Nov 29 06:40:58 compute-1 sudo[208552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:58 compute-1 python3.9[208554]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:58 compute-1 sudo[208552]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:40:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:59.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:40:59 compute-1 sudo[208704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkurjuhytsicjmzuudpbgpdgwovkaspa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398459.4824848-1808-223202876490842/AnsiballZ_file.py'
Nov 29 06:40:59 compute-1 sudo[208704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:40:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:40:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:59.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:00 compute-1 python3.9[208706]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 06:41:00 compute-1 sudo[208704]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:00 compute-1 sudo[208856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urjwicpgxouwabsqordclynryguxsdnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398460.3142607-1832-11859765386018/AnsiballZ_modprobe.py'
Nov 29 06:41:00 compute-1 sudo[208856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:00 compute-1 python3.9[208858]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 29 06:41:00 compute-1 kernel: Key type psk registered
Nov 29 06:41:00 compute-1 sudo[208856]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:00 compute-1 ceph-mon[80754]: pgmap v856: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:01.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:01 compute-1 sudo[209017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnvltypwvyaqoofqleskqfcxuqvdnrhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398461.1928601-1856-281339305287973/AnsiballZ_stat.py'
Nov 29 06:41:01 compute-1 sudo[209017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:01 compute-1 python3.9[209019]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:41:01 compute-1 sudo[209017]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:01.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:02 compute-1 ceph-mon[80754]: pgmap v857: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:02 compute-1 sudo[209140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svwrmfuohykvornosxpeeupmjwymuldl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398461.1928601-1856-281339305287973/AnsiballZ_copy.py'
Nov 29 06:41:02 compute-1 sudo[209140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:02 compute-1 python3.9[209142]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398461.1928601-1856-281339305287973/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:02 compute-1 sudo[209140]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:03.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:03 compute-1 sudo[209292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vablnesfubvwarpmdlbdbhwuaikzxyig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398462.9005084-1904-166169940115974/AnsiballZ_lineinfile.py'
Nov 29 06:41:03 compute-1 sudo[209292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:03 compute-1 python3.9[209294]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:03 compute-1 sudo[209292]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:03.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:04 compute-1 ceph-mon[80754]: pgmap v858: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 4.8 KiB/s rd, 0 B/s wr, 7 op/s
Nov 29 06:41:04 compute-1 sudo[209444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayqgddtvomwjgqzrkljukwkixawfvhdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398463.762271-1928-97178524837428/AnsiballZ_systemd.py'
Nov 29 06:41:04 compute-1 sudo[209444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:04 compute-1 python3.9[209446]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:41:04 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 06:41:04 compute-1 systemd[1]: Stopped Load Kernel Modules.
Nov 29 06:41:04 compute-1 systemd[1]: Stopping Load Kernel Modules...
Nov 29 06:41:04 compute-1 systemd[1]: Starting Load Kernel Modules...
Nov 29 06:41:04 compute-1 systemd[1]: Finished Load Kernel Modules.
Nov 29 06:41:04 compute-1 sudo[209444]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:05.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:05 compute-1 sudo[209600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdhusjfxwkerzfecnbywtfvficqztdxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398464.9024057-1952-17115361182540/AnsiballZ_dnf.py'
Nov 29 06:41:05 compute-1 sudo[209600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:05 compute-1 python3.9[209602]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:41:05 compute-1 ceph-mon[80754]: pgmap v859: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 36 KiB/s rd, 0 B/s wr, 59 op/s
Nov 29 06:41:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:05.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:07.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:07 compute-1 ceph-mon[80754]: pgmap v860: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 80 KiB/s rd, 0 B/s wr, 132 op/s
Nov 29 06:41:07 compute-1 systemd[1]: Reloading.
Nov 29 06:41:07 compute-1 systemd-rc-local-generator[209639]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:07 compute-1 systemd-sysv-generator[209642]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:07.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:08 compute-1 systemd[1]: Reloading.
Nov 29 06:41:08 compute-1 systemd-sysv-generator[209672]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:08 compute-1 systemd-rc-local-generator[209667]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:08 compute-1 systemd-logind[785]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 06:41:08 compute-1 systemd-logind[785]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 06:41:08 compute-1 lvm[209715]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 06:41:08 compute-1 lvm[209715]: VG ceph_vg0 finished
Nov 29 06:41:08 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:41:08 compute-1 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:41:08 compute-1 systemd[1]: Reloading.
Nov 29 06:41:08 compute-1 systemd-sysv-generator[209770]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:08 compute-1 systemd-rc-local-generator[209766]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:09 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:41:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:09.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:09 compute-1 sudo[209600]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:09.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:10 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:41:10 compute-1 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:41:10 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.819s CPU time.
Nov 29 06:41:10 compute-1 systemd[1]: run-re191bf4652a44b02bd4ce636969ccf72.service: Deactivated successfully.
Nov 29 06:41:10 compute-1 ceph-mon[80754]: pgmap v861: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 178 op/s
Nov 29 06:41:10 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:11.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:11.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:12 compute-1 ceph-mon[80754]: pgmap v862: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 178 op/s
Nov 29 06:41:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:13.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:13 compute-1 ceph-mon[80754]: pgmap v863: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 107 KiB/s rd, 0 B/s wr, 178 op/s
Nov 29 06:41:13 compute-1 sudo[211054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qubynpiepdoqkmbxighernirtyeopsnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398473.3374581-1976-43431511140218/AnsiballZ_systemd_service.py'
Nov 29 06:41:13 compute-1 sudo[211054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:13.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:13 compute-1 python3.9[211056]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:41:14 compute-1 systemd[1]: Stopping Open-iSCSI...
Nov 29 06:41:14 compute-1 iscsid[199143]: iscsid shutting down.
Nov 29 06:41:14 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Nov 29 06:41:14 compute-1 systemd[1]: Stopped Open-iSCSI.
Nov 29 06:41:14 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 06:41:14 compute-1 systemd[1]: Starting Open-iSCSI...
Nov 29 06:41:14 compute-1 systemd[1]: Started Open-iSCSI.
Nov 29 06:41:14 compute-1 sudo[211054]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:15 compute-1 python3.9[211211]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:41:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:15.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:41:15.908 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:41:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:41:15.910 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:41:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:41:15.910 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:41:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:15.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:15 compute-1 sudo[211365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eimpuggtssxrsofdqxfqneuzjnbklrfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398475.611736-2028-123123797776273/AnsiballZ_file.py'
Nov 29 06:41:15 compute-1 sudo[211365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:15 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:16 compute-1 python3.9[211367]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:16 compute-1 sudo[211365]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:16 compute-1 ceph-mon[80754]: pgmap v864: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 102 KiB/s rd, 0 B/s wr, 170 op/s
Nov 29 06:41:17 compute-1 sudo[211517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ashtzdydcxsqvnpndmpcvuuxwpdshutr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398476.8341546-2062-33645154956508/AnsiballZ_systemd_service.py'
Nov 29 06:41:17 compute-1 sudo[211517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:17.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:17 compute-1 python3.9[211519]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:41:17 compute-1 systemd[1]: Reloading.
Nov 29 06:41:17 compute-1 ceph-mon[80754]: pgmap v865: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 71 KiB/s rd, 0 B/s wr, 119 op/s
Nov 29 06:41:17 compute-1 systemd-rc-local-generator[211544]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:17 compute-1 systemd-sysv-generator[211548]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:17 compute-1 sudo[211517]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:17.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:18 compute-1 podman[211678]: 2025-11-29 06:41:18.539139976 +0000 UTC m=+0.100078031 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 06:41:18 compute-1 python3.9[211721]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:41:18 compute-1 network[211748]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:41:18 compute-1 network[211749]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:41:18 compute-1 network[211750]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:41:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:19.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:19.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:20 compute-1 ceph-mon[80754]: pgmap v866: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 27 KiB/s rd, 0 B/s wr, 45 op/s
Nov 29 06:41:20 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:21.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:21.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:21 compute-1 ceph-mon[80754]: pgmap v867: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:23.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:23 compute-1 sudo[212023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-labfjnpiuylbywdrffndvnyjvdqvcjwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398483.1755984-2118-153670294335974/AnsiballZ_systemd_service.py'
Nov 29 06:41:23 compute-1 sudo[212023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:23.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:23 compute-1 python3.9[212025]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:23 compute-1 sudo[212023]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:24 compute-1 ceph-mon[80754]: pgmap v868: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:24 compute-1 sudo[212176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iexdsvgasyztlitnvwezowlchuywklrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398484.14639-2118-50126686249273/AnsiballZ_systemd_service.py'
Nov 29 06:41:24 compute-1 sudo[212176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:24 compute-1 python3.9[212178]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:24 compute-1 sudo[212176]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:25.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:25 compute-1 podman[212300]: 2025-11-29 06:41:25.33350669 +0000 UTC m=+0.058173356 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:41:25 compute-1 sudo[212346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmwvxyvlnubqapchnzhpxgncgzqiicmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398485.0072474-2118-207594677693451/AnsiballZ_systemd_service.py'
Nov 29 06:41:25 compute-1 sudo[212346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:25 compute-1 python3.9[212350]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:25 compute-1 sudo[212346]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:25 compute-1 ceph-mon[80754]: pgmap v869: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:25 compute-1 podman[212352]: 2025-11-29 06:41:25.741707739 +0000 UTC m=+0.089142287 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd)
Nov 29 06:41:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:25.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:26 compute-1 sudo[212522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmsansmcnscixhqequmsqeqzpudhgyfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398485.7982185-2118-98748390061700/AnsiballZ_systemd_service.py'
Nov 29 06:41:26 compute-1 sudo[212522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:26 compute-1 python3.9[212524]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:26 compute-1 sudo[212522]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:26 compute-1 sudo[212675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqvfsotrqqzlezgdcqkcharmbrkkpmja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398486.6325834-2118-239170043576394/AnsiballZ_systemd_service.py'
Nov 29 06:41:26 compute-1 sudo[212675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:27 compute-1 python3.9[212677]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:27.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:27 compute-1 sudo[212675]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:27 compute-1 sudo[212828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdphetofyioflkklshmsppjitcfdcidz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398487.420328-2118-13863026641421/AnsiballZ_systemd_service.py'
Nov 29 06:41:27 compute-1 sudo[212828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:27.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:28 compute-1 python3.9[212830]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:28 compute-1 sudo[212828]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:28 compute-1 ceph-mon[80754]: pgmap v870: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:28 compute-1 sudo[212981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivhmlmbqgqlpduzszjwccivkntqiofdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398488.269108-2118-125267847040842/AnsiballZ_systemd_service.py'
Nov 29 06:41:28 compute-1 sudo[212981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:28 compute-1 python3.9[212983]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:28 compute-1 sudo[212981]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:29.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:29 compute-1 sudo[213134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjvajanibcxnzkoqvkozelowfhldmzdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398489.0843494-2118-138870784278634/AnsiballZ_systemd_service.py'
Nov 29 06:41:29 compute-1 sudo[213134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:29 compute-1 python3.9[213136]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:29 compute-1 sudo[213134]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:41:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:29.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:41:30 compute-1 sudo[213287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibhprzlabdlbdvlttlykpqtlobwxdxyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398490.2923522-2295-227991881444327/AnsiballZ_file.py'
Nov 29 06:41:30 compute-1 sudo[213287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:30 compute-1 ceph-mon[80754]: pgmap v871: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:30 compute-1 python3.9[213289]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:30 compute-1 sudo[213287]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:30 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:31.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:31 compute-1 sudo[213439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apwtbtcuagsbobrdhxqzorgydudutotr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398490.9748805-2295-137490901488582/AnsiballZ_file.py'
Nov 29 06:41:31 compute-1 sudo[213439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:31 compute-1 python3.9[213441]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:31 compute-1 sudo[213439]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:31.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:32 compute-1 sudo[213591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqcqaaszcvcbqhddttrxyiupfprsxctk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398491.7327802-2295-193796445049117/AnsiballZ_file.py'
Nov 29 06:41:32 compute-1 sudo[213591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:32 compute-1 ceph-mon[80754]: pgmap v872: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:32 compute-1 python3.9[213593]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:32 compute-1 sudo[213591]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:32 compute-1 sudo[213743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abmdjjoxwkuyewbiikwskbgostjqxght ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398492.45012-2295-90083581894412/AnsiballZ_file.py'
Nov 29 06:41:32 compute-1 sudo[213743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:32 compute-1 python3.9[213745]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:32 compute-1 sudo[213743]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000055s ======
Nov 29 06:41:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:33.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Nov 29 06:41:33 compute-1 ceph-mon[80754]: pgmap v873: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:33 compute-1 sudo[213895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snayzunfavqvmailqiweemlbyhqmycqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398493.1104114-2295-50670322950292/AnsiballZ_file.py'
Nov 29 06:41:33 compute-1 sudo[213895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:33 compute-1 python3.9[213897]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:33 compute-1 sudo[213895]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:33.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:34 compute-1 sudo[214047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bypgcesoqsspyrwsdjnzdhjtxksdpdnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398493.7990284-2295-192852564571878/AnsiballZ_file.py'
Nov 29 06:41:34 compute-1 sudo[214047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:34 compute-1 python3.9[214049]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:34 compute-1 sudo[214047]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:34 compute-1 sudo[214199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubxdokvzvyfzcgxzemkqvqbdwwpbggyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398494.4521813-2295-142102466025215/AnsiballZ_file.py'
Nov 29 06:41:34 compute-1 sudo[214199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:34 compute-1 python3.9[214201]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:34 compute-1 sudo[214199]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:35.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:35 compute-1 sudo[214351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkfmlydbqqjvrqarneeazmnbipuspyls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398495.1350448-2295-29490805232159/AnsiballZ_file.py'
Nov 29 06:41:35 compute-1 sudo[214351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:35 compute-1 ceph-mon[80754]: pgmap v874: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:35 compute-1 python3.9[214353]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:35 compute-1 sudo[214351]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:35.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:35 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:36 compute-1 sudo[214378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:36 compute-1 sudo[214378]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:36 compute-1 sudo[214378]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:36 compute-1 sudo[214403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:41:36 compute-1 sudo[214403]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:36 compute-1 sudo[214403]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:36 compute-1 sudo[214428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:36 compute-1 sudo[214428]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:36 compute-1 sudo[214428]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:36 compute-1 sudo[214453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:41:36 compute-1 sudo[214453]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:37 compute-1 sudo[214453]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:37 compute-1 sudo[214634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnjkourlabkgpdeilfhnyiyhomcuoltc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398496.8777077-2466-146398432401718/AnsiballZ_file.py'
Nov 29 06:41:37 compute-1 sudo[214634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:37.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:37 compute-1 python3.9[214636]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:37 compute-1 sudo[214634]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:37 compute-1 ceph-mon[80754]: pgmap v875: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:37 compute-1 sudo[214786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzphcyvawyukzrhjsrrdefkqzehoexpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398497.5347228-2466-34062322267413/AnsiballZ_file.py'
Nov 29 06:41:37 compute-1 sudo[214786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:37.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:38 compute-1 python3.9[214788]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:38 compute-1 sudo[214786]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:38 compute-1 sudo[214938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gammtxbkjhnbhuonvvdkvvxjdhqvbyoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398498.2224717-2466-114261282289234/AnsiballZ_file.py'
Nov 29 06:41:38 compute-1 sudo[214938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:38 compute-1 python3.9[214940]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:38 compute-1 sudo[214938]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:39 compute-1 sudo[215090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syavoxsfeqjnbmabgpgxztktnytanqxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398498.8919683-2466-41665980178020/AnsiballZ_file.py'
Nov 29 06:41:39 compute-1 sudo[215090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:39.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:41:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:41:39 compute-1 ceph-mon[80754]: pgmap v876: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 06:41:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:41:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:41:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:41:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:41:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:41:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:41:39 compute-1 python3.9[215092]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:39 compute-1 sudo[215090]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:39 compute-1 sudo[215242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exjdkxwiakgfuehpmlgyevdlmzerujxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398499.5922728-2466-87308894669045/AnsiballZ_file.py'
Nov 29 06:41:39 compute-1 sudo[215242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:39.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:40 compute-1 python3.9[215244]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:40 compute-1 sudo[215242]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:40 compute-1 sudo[215394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hltxngxkuczsxcvyoujdqipdhzncawxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398500.4609032-2466-140823743622330/AnsiballZ_file.py'
Nov 29 06:41:40 compute-1 sudo[215394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:40 compute-1 python3.9[215396]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:40 compute-1 sudo[215394]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:41 compute-1 ceph-mon[80754]: pgmap v877: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:41.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:41 compute-1 sudo[215546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqzvwfzdmcywxnwiyqkprmylsxomnqfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398501.0895746-2466-234353479289021/AnsiballZ_file.py'
Nov 29 06:41:41 compute-1 sudo[215546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:41 compute-1 python3.9[215548]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:41 compute-1 sudo[215546]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:41.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:42 compute-1 sudo[215698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odwjdpjnhmizkcaroyaslyzvdylcvsxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398501.8398585-2466-14637485123428/AnsiballZ_file.py'
Nov 29 06:41:42 compute-1 sudo[215698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:42 compute-1 python3.9[215700]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:42 compute-1 sudo[215698]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:43.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:43 compute-1 sudo[215850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faoiuqmnsvqozulmpnnjiekybpdtwvki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398503.071722-2641-239725337109660/AnsiballZ_command.py'
Nov 29 06:41:43 compute-1 sudo[215850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:43 compute-1 python3.9[215852]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:43 compute-1 sudo[215850]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:43 compute-1 ceph-mon[80754]: pgmap v878: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:43.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:44 compute-1 python3.9[216004]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:41:45 compute-1 sudo[216154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofdsvspdulpaxjxejrudngcoljnlprxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398504.9150286-2694-187990897029088/AnsiballZ_systemd_service.py'
Nov 29 06:41:45 compute-1 sudo[216154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:45.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:45 compute-1 python3.9[216156]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:41:45 compute-1 systemd[1]: Reloading.
Nov 29 06:41:45 compute-1 systemd-rc-local-generator[216183]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:45 compute-1 systemd-sysv-generator[216186]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:45 compute-1 ceph-mon[80754]: pgmap v879: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:45 compute-1 sudo[216154]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:45.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:46 compute-1 sudo[216341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxrkrrdkkqkfvzerddgqoekxqfywaslb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398506.229957-2718-250655597250007/AnsiballZ_command.py'
Nov 29 06:41:46 compute-1 sudo[216341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:46 compute-1 sudo[216344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:41:46 compute-1 sudo[216344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:46 compute-1 sudo[216344]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:46 compute-1 python3.9[216343]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:46 compute-1 sudo[216341]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:46 compute-1 sudo[216369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:41:46 compute-1 sudo[216369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:41:46 compute-1 sudo[216369]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:47 compute-1 sudo[216546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jusrkujxotmeyslupkntrbxdnlpgfgms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398506.884861-2718-259540155747923/AnsiballZ_command.py'
Nov 29 06:41:47 compute-1 sudo[216546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:47.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:47 compute-1 python3.9[216548]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:47 compute-1 sudo[216546]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:47 compute-1 sshd-session[216376]: Invalid user jmarquez from 93.157.248.178 port 34730
Nov 29 06:41:47 compute-1 sshd-session[216376]: Received disconnect from 93.157.248.178 port 34730:11: Bye Bye [preauth]
Nov 29 06:41:47 compute-1 sshd-session[216376]: Disconnected from invalid user jmarquez 93.157.248.178 port 34730 [preauth]
Nov 29 06:41:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:41:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:41:47 compute-1 ceph-mon[80754]: pgmap v880: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:47 compute-1 sudo[216699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owtlrxgtlbwwtvnxwiwlswcrfucgtjjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398507.5166223-2718-50219909149528/AnsiballZ_command.py'
Nov 29 06:41:47 compute-1 sudo[216699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:47.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:48 compute-1 python3.9[216701]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:48 compute-1 sudo[216699]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:48 compute-1 sudo[216852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsnmzlzcwtfzucmgqplsrlorbuhihobs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398508.215306-2718-278384356383806/AnsiballZ_command.py'
Nov 29 06:41:48 compute-1 sudo[216852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:48 compute-1 python3.9[216854]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:48 compute-1 sudo[216852]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:48 compute-1 podman[216856]: 2025-11-29 06:41:48.911129669 +0000 UTC m=+0.158417921 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:41:49 compute-1 sudo[217031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqdmpscotuaeegzesutgznumrpojoiby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398508.9365783-2718-78264950171502/AnsiballZ_command.py'
Nov 29 06:41:49 compute-1 sudo[217031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:49.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:49 compute-1 python3.9[217033]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:49 compute-1 sudo[217031]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:49 compute-1 sudo[217184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bocmksmrysetwyecbbpypjvxroefznlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398509.5788996-2718-268369565933044/AnsiballZ_command.py'
Nov 29 06:41:49 compute-1 sudo[217184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:49.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:50 compute-1 python3.9[217186]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:50 compute-1 sudo[217184]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:50 compute-1 ceph-mon[80754]: pgmap v881: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:50 compute-1 sudo[217337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fccesqbxppxnqcjakfsnuxghxnreubrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398510.2533383-2718-112320761392633/AnsiballZ_command.py'
Nov 29 06:41:50 compute-1 sudo[217337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:50 compute-1 python3.9[217339]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:50 compute-1 sudo[217337]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:51 compute-1 sudo[217490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-focxnhqreugojmyqiekhvrdxkitmbmns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398510.9517179-2718-130089682900633/AnsiballZ_command.py'
Nov 29 06:41:51 compute-1 sudo[217490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:51.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:51 compute-1 python3.9[217492]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:51 compute-1 sudo[217490]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:51 compute-1 ceph-mon[80754]: pgmap v882: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:51.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:53.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:53 compute-1 ceph-mon[80754]: pgmap v883: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:53.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:54 compute-1 sudo[217643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovgvqzveiiouuvnubqwweginqwyfcgzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398514.174297-2925-33848283400974/AnsiballZ_file.py'
Nov 29 06:41:54 compute-1 sudo[217643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:54 compute-1 python3.9[217645]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:54 compute-1 sudo[217643]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:55 compute-1 sudo[217795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmqdcmunddjgegkxmgoiypmdzshdckkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398514.905791-2925-277533543862958/AnsiballZ_file.py'
Nov 29 06:41:55 compute-1 sudo[217795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:55.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:55 compute-1 python3.9[217797]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:55 compute-1 sudo[217795]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:55 compute-1 podman[217798]: 2025-11-29 06:41:55.544719489 +0000 UTC m=+0.065554442 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 06:41:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:55.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:55 compute-1 sudo[217981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsysivlatymyekxfrjrcvsunqyescvij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398515.6417658-2925-222287353414574/AnsiballZ_file.py'
Nov 29 06:41:55 compute-1 sudo[217981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:41:56 compute-1 podman[217940]: 2025-11-29 06:41:56.005530888 +0000 UTC m=+0.079924411 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 06:41:56 compute-1 python3.9[217988]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:56 compute-1 sudo[217981]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:56 compute-1 ceph-mon[80754]: pgmap v884: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:56 compute-1 sudo[218138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efrvgigllofkzjuvtiwjudghcajxttoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398516.4339328-2991-67815201904772/AnsiballZ_file.py'
Nov 29 06:41:56 compute-1 sudo[218138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:56 compute-1 python3.9[218140]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:56 compute-1 sudo[218138]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:57.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:57 compute-1 sudo[218290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilurantdtdcbpmxibhfkiumzpsnndajx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398517.0850313-2991-49076930398000/AnsiballZ_file.py'
Nov 29 06:41:57 compute-1 sudo[218290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:57 compute-1 ceph-mon[80754]: pgmap v885: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:57 compute-1 python3.9[218292]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:57 compute-1 sudo[218290]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:41:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:57.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:41:58 compute-1 sudo[218442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eghawlpqxhmqebjgvfhqntiaeihkxmkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398517.7955673-2991-239295743727208/AnsiballZ_file.py'
Nov 29 06:41:58 compute-1 sudo[218442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:58 compute-1 python3.9[218444]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:58 compute-1 sudo[218442]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:58 compute-1 sudo[218594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvbxgxkovydrsowbxsgagxspexeinset ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398518.5053-2991-57063615897622/AnsiballZ_file.py'
Nov 29 06:41:58 compute-1 sudo[218594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:58 compute-1 python3.9[218596]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:58 compute-1 sudo[218594]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:59.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:41:59 compute-1 sudo[218746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcnydjajrlebnauuunckemdkttpgcwmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398519.14045-2991-90899434882285/AnsiballZ_file.py'
Nov 29 06:41:59 compute-1 sudo[218746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:59 compute-1 python3.9[218748]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:59 compute-1 sudo[218746]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:59 compute-1 ceph-mon[80754]: pgmap v886: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:41:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:41:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:41:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:59.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:00 compute-1 sudo[218898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laesukzgjzosoqdnafachljegbuitycf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398519.7754157-2991-132738657212282/AnsiballZ_file.py'
Nov 29 06:42:00 compute-1 sudo[218898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:00 compute-1 python3.9[218900]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:00 compute-1 sudo[218898]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:00 compute-1 sudo[219050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgxjwjisapoaawhnjrwckfnuuorhrgbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398520.405535-2991-267250739903454/AnsiballZ_file.py'
Nov 29 06:42:00 compute-1 sudo[219050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:00 compute-1 python3.9[219052]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:00 compute-1 sudo[219050]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:01.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:01 compute-1 ceph-mon[80754]: pgmap v887: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:01.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:03.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:03 compute-1 ceph-mon[80754]: pgmap v888: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:03.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:05.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:05 compute-1 ceph-mon[80754]: pgmap v889: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:05.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:07.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:07 compute-1 ceph-mon[80754]: pgmap v890: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:08.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:09.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:09 compute-1 ceph-mon[80754]: pgmap v891: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:10.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:11 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:11.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:11 compute-1 ceph-mon[80754]: pgmap v892: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:12.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:13.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:14.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:14 compute-1 ceph-mon[80754]: pgmap v893: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:14 compute-1 sudo[219202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyzyneanrzhdsksghcucchcmqdcbbnaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398534.3021827-3316-54927987493139/AnsiballZ_getent.py'
Nov 29 06:42:14 compute-1 sudo[219202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:14 compute-1 python3.9[219204]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 29 06:42:14 compute-1 sudo[219202]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:15.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:15 compute-1 sudo[219355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcxcnrobuscozlmwleejznvaxkadtxxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398535.2454476-3340-103289865432775/AnsiballZ_group.py'
Nov 29 06:42:15 compute-1 sudo[219355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:15 compute-1 python3.9[219357]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:42:15 compute-1 groupadd[219358]: group added to /etc/group: name=nova, GID=42436
Nov 29 06:42:15 compute-1 groupadd[219358]: group added to /etc/gshadow: name=nova
Nov 29 06:42:15 compute-1 groupadd[219358]: new group: name=nova, GID=42436
Nov 29 06:42:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:42:15.910 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:42:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:42:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:42:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:42:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:42:15 compute-1 sudo[219355]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:16.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:16 compute-1 ceph-mon[80754]: pgmap v894: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:16 compute-1 sudo[219513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnqrhykyriefjaeydfjynjamfhncdskv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398536.21341-3364-7893198298661/AnsiballZ_user.py'
Nov 29 06:42:16 compute-1 sudo[219513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:16 compute-1 python3.9[219515]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:42:16 compute-1 useradd[219517]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 29 06:42:16 compute-1 useradd[219517]: add 'nova' to group 'libvirt'
Nov 29 06:42:16 compute-1 useradd[219517]: add 'nova' to shadow group 'libvirt'
Nov 29 06:42:17 compute-1 sudo[219513]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:17 compute-1 ceph-mon[80754]: pgmap v895: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:17.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:18.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:18 compute-1 sshd-session[219548]: Accepted publickey for zuul from 192.168.122.30 port 53484 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:42:18 compute-1 systemd-logind[785]: New session 50 of user zuul.
Nov 29 06:42:18 compute-1 systemd[1]: Started Session 50 of User zuul.
Nov 29 06:42:18 compute-1 sshd-session[219548]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:42:18 compute-1 sshd-session[219551]: Received disconnect from 192.168.122.30 port 53484:11: disconnected by user
Nov 29 06:42:18 compute-1 sshd-session[219551]: Disconnected from user zuul 192.168.122.30 port 53484
Nov 29 06:42:18 compute-1 sshd-session[219548]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:42:18 compute-1 systemd[1]: session-50.scope: Deactivated successfully.
Nov 29 06:42:18 compute-1 systemd-logind[785]: Session 50 logged out. Waiting for processes to exit.
Nov 29 06:42:18 compute-1 systemd-logind[785]: Removed session 50.
Nov 29 06:42:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:19.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:19 compute-1 podman[219645]: 2025-11-29 06:42:19.367500633 +0000 UTC m=+0.110348870 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 06:42:19 compute-1 python3.9[219727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:19 compute-1 ceph-mon[80754]: pgmap v896: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:20.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:20 compute-1 python3.9[219849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398539.0266728-3439-270502854871527/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:20 compute-1 python3.9[219999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:21 compute-1 ceph-mon[80754]: pgmap v897: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:21 compute-1 python3.9[220075]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:21.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:22.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:22 compute-1 python3.9[220225]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:22 compute-1 python3.9[220346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398541.406909-3439-55184346923556/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:23.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:23 compute-1 python3.9[220496]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:23 compute-1 ceph-mon[80754]: pgmap v898: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:24 compute-1 python3.9[220617]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398542.9289002-3439-204447158189704/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:24.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:24 compute-1 python3.9[220767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:25 compute-1 python3.9[220888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398544.182562-3439-205740641074297/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:25.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:25 compute-1 podman[221012]: 2025-11-29 06:42:25.743266266 +0000 UTC m=+0.062418042 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 06:42:25 compute-1 python3.9[221051]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:26.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:26 compute-1 ceph-mon[80754]: pgmap v899: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:26 compute-1 podman[221150]: 2025-11-29 06:42:26.313490225 +0000 UTC m=+0.054339068 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 29 06:42:26 compute-1 python3.9[221192]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398545.4034517-3439-196084505826102/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:27.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:27 compute-1 sudo[221346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzeevwchekmldecikahemmemujhpteat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398547.0798697-3688-7921925560181/AnsiballZ_file.py'
Nov 29 06:42:27 compute-1 sudo[221346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:27 compute-1 python3.9[221348]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:27 compute-1 sudo[221346]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:28.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:28 compute-1 ceph-mon[80754]: pgmap v900: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:28 compute-1 sudo[221498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lizuydmuhzebushalcovcdvesymitnlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398547.8806162-3712-163517899234588/AnsiballZ_copy.py'
Nov 29 06:42:28 compute-1 sudo[221498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:28 compute-1 python3.9[221500]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:28 compute-1 sudo[221498]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:29 compute-1 sudo[221650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcfbtnqjtcwsscbmztibvvezreerchde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398548.6473668-3736-243958459006138/AnsiballZ_stat.py'
Nov 29 06:42:29 compute-1 sudo[221650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:29 compute-1 python3.9[221652]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:29 compute-1 sudo[221650]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:29.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:29 compute-1 ceph-mon[80754]: pgmap v901: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:29 compute-1 sudo[221802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huehedspxlmdowyxhcckxngxslyctogu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398549.4923234-3761-199268457231891/AnsiballZ_stat.py'
Nov 29 06:42:29 compute-1 sudo[221802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:30 compute-1 python3.9[221804]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:30 compute-1 sudo[221802]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:30.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:30 compute-1 sudo[221925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tczlcymgjzznenjrhmzdrpahmeaxjqbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398549.4923234-3761-199268457231891/AnsiballZ_copy.py'
Nov 29 06:42:30 compute-1 sudo[221925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:30 compute-1 python3.9[221927]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764398549.4923234-3761-199268457231891/.source _original_basename=.5kp900w0 follow=False checksum=4f791796328ccd9f4aee7287aaf8210be2f93bd0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 29 06:42:30 compute-1 sudo[221925]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:31.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:31 compute-1 python3.9[222079]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:32.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:32 compute-1 ceph-mon[80754]: pgmap v902: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:32 compute-1 python3.9[222231]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:33 compute-1 python3.9[222352]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398551.8813589-3838-180717272968298/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:33 compute-1 ceph-mon[80754]: pgmap v903: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:33.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:33 compute-1 python3.9[222502]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:34.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:34 compute-1 python3.9[222623]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398553.2543547-3883-41171828568607/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:35 compute-1 sudo[222773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyiewtvpfedokaahgfnkhtmllttkscua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398554.9251225-3936-70861693846628/AnsiballZ_container_config_data.py'
Nov 29 06:42:35 compute-1 sudo[222773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:35.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:35 compute-1 python3.9[222775]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 29 06:42:35 compute-1 sudo[222773]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:35 compute-1 ceph-mon[80754]: pgmap v904: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:36 compute-1 sudo[222925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwjryeweoihlnictijplkprqghvjmmva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398555.7412033-3961-161745886646905/AnsiballZ_container_config_hash.py'
Nov 29 06:42:36 compute-1 sudo[222925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:36.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:36 compute-1 python3.9[222927]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:42:36 compute-1 sudo[222925]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:37 compute-1 sudo[223077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpdsnouksihxtkeeyzileloawpkqthgs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398556.8380525-3991-268108661217859/AnsiballZ_edpm_container_manage.py'
Nov 29 06:42:37 compute-1 sudo[223077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:37.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:37 compute-1 python3[223079]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:42:37 compute-1 ceph-mon[80754]: pgmap v905: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:38.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:39.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:40 compute-1 ceph-mon[80754]: pgmap v906: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:40.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:41.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:41 compute-1 ceph-mon[80754]: pgmap v907: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:42.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:43.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:44.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:44 compute-1 sshd-session[223080]: Connection closed by 66.94.122.234 port 38024 [preauth]
Nov 29 06:42:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:45.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:46.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:46 compute-1 sudo[223154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:42:46 compute-1 sudo[223154]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:46 compute-1 sudo[223154]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:47 compute-1 sudo[223179]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:42:47 compute-1 sudo[223179]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:47 compute-1 sudo[223179]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:47 compute-1 sudo[223204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:42:47 compute-1 sudo[223204]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:47 compute-1 sudo[223204]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:47 compute-1 sudo[223229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:42:47 compute-1 sudo[223229]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:42:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:47.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:48.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:49 compute-1 ceph-mon[80754]: pgmap v908: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:49 compute-1 podman[223096]: 2025-11-29 06:42:49.347249554 +0000 UTC m=+11.803139062 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 06:42:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:42:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:49.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:42:49 compute-1 podman[223294]: 2025-11-29 06:42:49.504559895 +0000 UTC m=+0.049596065 container create 3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:42:49 compute-1 podman[223294]: 2025-11-29 06:42:49.473990189 +0000 UTC m=+0.019026379 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 06:42:49 compute-1 python3[223079]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 29 06:42:49 compute-1 sudo[223229]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:49 compute-1 sudo[223077]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:49 compute-1 ceph-mon[80754]: pgmap v909: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:49 compute-1 ceph-mon[80754]: pgmap v910: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:50.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:50 compute-1 podman[223369]: 2025-11-29 06:42:50.413388952 +0000 UTC m=+0.148976512 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 06:42:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:51 compute-1 sudo[223520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lughylozwhewqacfolggezvxzqvdurvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398570.8025951-4015-37307516872226/AnsiballZ_stat.py'
Nov 29 06:42:51 compute-1 sudo[223520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:51.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:51 compute-1 python3.9[223522]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:51 compute-1 sudo[223520]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:52.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:52 compute-1 sudo[223674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfiaygyvispqzcbovcuofhpnubnclnsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398572.2373002-4051-266694207767304/AnsiballZ_container_config_data.py'
Nov 29 06:42:52 compute-1 sudo[223674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:52 compute-1 python3.9[223676]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 29 06:42:52 compute-1 sudo[223674]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:53.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:53 compute-1 sudo[223828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrkqlwfdcogzopnlxmgakqprwfxpijef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398573.2435813-4078-197971615606283/AnsiballZ_container_config_hash.py'
Nov 29 06:42:53 compute-1 sudo[223828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:54 compute-1 python3.9[223830]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:42:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:54.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:54 compute-1 sudo[223828]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:54 compute-1 sshd-session[223794]: Invalid user hamed from 93.157.248.178 port 45394
Nov 29 06:42:54 compute-1 ceph-mon[80754]: pgmap v911: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:42:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:42:54 compute-1 ceph-mon[80754]: pgmap v912: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:42:54 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:42:54 compute-1 sshd-session[223794]: Received disconnect from 93.157.248.178 port 45394:11: Bye Bye [preauth]
Nov 29 06:42:54 compute-1 sshd-session[223794]: Disconnected from invalid user hamed 93.157.248.178 port 45394 [preauth]
Nov 29 06:42:54 compute-1 sudo[223980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnnrlaovfhwiahjxyssbaivzescrpudh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398574.5987787-4108-84998509823325/AnsiballZ_edpm_container_manage.py'
Nov 29 06:42:54 compute-1 sudo[223980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:55 compute-1 python3[223982]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:42:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:55.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:55 compute-1 podman[224020]: 2025-11-29 06:42:55.43103818 +0000 UTC m=+0.027957346 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 06:42:55 compute-1 podman[224020]: 2025-11-29 06:42:55.626125988 +0000 UTC m=+0.223045134 container create 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:42:55 compute-1 python3[223982]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 29 06:42:55 compute-1 sudo[223980]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:42:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:56.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:42:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:42:56 compute-1 podman[224083]: 2025-11-29 06:42:56.316160809 +0000 UTC m=+0.053372961 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 06:42:56 compute-1 podman[224102]: 2025-11-29 06:42:56.421127629 +0000 UTC m=+0.072993015 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:42:56 compute-1 ceph-mon[80754]: pgmap v913: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:56 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:42:56 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:42:56 compute-1 ceph-mon[80754]: pgmap v914: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:57 compute-1 sudo[224249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asheixmpteeedzzowvqtybdyryvmxqen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398576.8329632-4132-67093106746789/AnsiballZ_stat.py'
Nov 29 06:42:57 compute-1 sudo[224249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:57 compute-1 python3.9[224251]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:57 compute-1 sudo[224249]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:57.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:57 compute-1 ceph-mon[80754]: pgmap v915: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:58 compute-1 sudo[224403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uflkdgdjbmcxqdbodmmtplpwqtktzadq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398577.7909224-4159-71192870007432/AnsiballZ_file.py'
Nov 29 06:42:58 compute-1 sudo[224403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:58.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:58 compute-1 python3.9[224405]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:58 compute-1 sudo[224403]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:58 compute-1 sudo[224556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukywcoerdgikhonlffkyeqazhttjpnuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398578.319969-4159-27554632204550/AnsiballZ_copy.py'
Nov 29 06:42:58 compute-1 sudo[224556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:58 compute-1 sshd-session[224462]: Invalid user mysql from 71.70.164.48 port 41505
Nov 29 06:42:58 compute-1 sshd-session[224462]: Received disconnect from 71.70.164.48 port 41505:11: Bye Bye [preauth]
Nov 29 06:42:58 compute-1 sshd-session[224462]: Disconnected from invalid user mysql 71.70.164.48 port 41505 [preauth]
Nov 29 06:42:58 compute-1 python3.9[224558]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398578.319969-4159-27554632204550/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:58 compute-1 sudo[224556]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:59 compute-1 sudo[224632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jurxqpjfvkdosaftbthrqzxchhvyubmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398578.319969-4159-27554632204550/AnsiballZ_systemd.py'
Nov 29 06:42:59 compute-1 sudo[224632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:42:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:42:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:59.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:42:59 compute-1 python3.9[224634]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:42:59 compute-1 systemd[1]: Reloading.
Nov 29 06:42:59 compute-1 systemd-rc-local-generator[224662]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:42:59 compute-1 systemd-sysv-generator[224667]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:42:59 compute-1 ceph-mon[80754]: pgmap v916: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:42:59 compute-1 sudo[224632]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:00.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:00 compute-1 sudo[224744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqgkbfcoywettrwwrkdciubnqbptqkxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398578.319969-4159-27554632204550/AnsiballZ_systemd.py'
Nov 29 06:43:00 compute-1 sudo[224744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:00 compute-1 python3.9[224746]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:43:00 compute-1 systemd[1]: Reloading.
Nov 29 06:43:00 compute-1 systemd-sysv-generator[224782]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:43:00 compute-1 systemd-rc-local-generator[224779]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:43:00 compute-1 systemd[1]: Starting nova_compute container...
Nov 29 06:43:01 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:43:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:01 compute-1 podman[224787]: 2025-11-29 06:43:01.017983852 +0000 UTC m=+0.098440710 container init 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:43:01 compute-1 podman[224787]: 2025-11-29 06:43:01.029137351 +0000 UTC m=+0.109594209 container start 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 06:43:01 compute-1 podman[224787]: nova_compute
Nov 29 06:43:01 compute-1 nova_compute[224801]: + sudo -E kolla_set_configs
Nov 29 06:43:01 compute-1 systemd[1]: Started nova_compute container.
Nov 29 06:43:01 compute-1 sudo[224744]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:01 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Validating config file
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying service configuration files
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Deleting /etc/ceph
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Creating directory /etc/ceph
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Writing out command to execute
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:01 compute-1 nova_compute[224801]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:43:01 compute-1 nova_compute[224801]: ++ cat /run_command
Nov 29 06:43:01 compute-1 nova_compute[224801]: + CMD=nova-compute
Nov 29 06:43:01 compute-1 nova_compute[224801]: + ARGS=
Nov 29 06:43:01 compute-1 nova_compute[224801]: + sudo kolla_copy_cacerts
Nov 29 06:43:01 compute-1 nova_compute[224801]: + [[ ! -n '' ]]
Nov 29 06:43:01 compute-1 nova_compute[224801]: + . kolla_extend_start
Nov 29 06:43:01 compute-1 nova_compute[224801]: Running command: 'nova-compute'
Nov 29 06:43:01 compute-1 nova_compute[224801]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 06:43:01 compute-1 nova_compute[224801]: + umask 0022
Nov 29 06:43:01 compute-1 nova_compute[224801]: + exec nova-compute
Nov 29 06:43:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:01.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:01 compute-1 ceph-mon[80754]: pgmap v917: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:43:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:02.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:43:02 compute-1 python3.9[224963]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:43:03 compute-1 nova_compute[224801]: 2025-11-29 06:43:03.273 224805 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:03 compute-1 nova_compute[224801]: 2025-11-29 06:43:03.273 224805 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:03 compute-1 nova_compute[224801]: 2025-11-29 06:43:03.274 224805 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:03 compute-1 nova_compute[224801]: 2025-11-29 06:43:03.274 224805 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 29 06:43:03 compute-1 sudo[225002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:43:03 compute-1 sudo[225002]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:43:03 compute-1 sudo[225002]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:03.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:03 compute-1 sudo[225064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:43:03 compute-1 sudo[225064]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:43:03 compute-1 sudo[225064]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:03 compute-1 nova_compute[224801]: 2025-11-29 06:43:03.452 224805 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:43:03 compute-1 nova_compute[224801]: 2025-11-29 06:43:03.470 224805 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:43:03 compute-1 nova_compute[224801]: 2025-11-29 06:43:03.471 224805 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 06:43:03 compute-1 python3.9[225167]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:43:03 compute-1 ceph-mon[80754]: pgmap v918: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:43:03 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:43:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:04.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.256 224805 INFO nova.virt.driver [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.426 224805 INFO nova.compute.provider_config [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.441 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.442 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.442 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.443 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.443 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.443 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.443 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.444 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.444 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.444 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.444 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.444 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.445 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.445 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.445 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.445 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.445 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.446 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.446 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.446 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.446 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.447 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.447 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.447 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.447 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.447 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.448 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.448 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.448 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.448 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.448 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.449 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.449 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.449 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.449 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.450 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.450 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.450 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.450 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.450 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.451 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.451 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.451 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.451 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.451 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.452 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.452 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.452 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.452 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.453 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.453 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.453 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.453 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.454 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.454 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.454 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.454 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.455 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.455 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.455 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.455 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.455 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.456 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.456 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.456 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.456 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.456 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.457 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.457 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.457 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.457 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.457 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.458 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.458 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.458 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.458 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.458 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.459 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.459 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.459 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.459 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.459 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.460 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.460 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.460 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.460 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.461 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.461 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.461 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.462 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.462 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.462 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.463 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.463 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.463 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.464 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.464 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.464 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.464 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.465 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.465 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.465 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.466 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.466 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.466 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.466 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.467 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.467 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.467 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.467 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.468 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.468 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.468 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.468 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.469 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.469 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.469 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.469 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.470 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.470 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.470 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.471 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.471 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.471 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.472 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.472 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.472 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.472 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.472 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.473 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.473 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.473 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.473 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.473 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.474 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.474 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.474 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.474 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.474 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.475 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.475 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.475 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.475 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.476 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.476 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.476 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.476 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.476 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.477 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.477 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.477 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.477 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.478 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.478 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.478 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.478 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.479 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.479 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.479 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.479 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.479 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.480 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.480 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.480 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.480 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.481 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.481 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.481 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.481 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.481 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.482 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.482 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.482 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.482 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.482 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.483 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.483 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.483 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.483 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.483 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.484 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.484 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.484 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.484 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.485 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.485 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.485 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.485 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.485 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.486 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.486 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.486 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.486 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.487 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.487 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.487 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.487 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.488 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.488 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.488 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.488 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.488 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.489 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.489 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.489 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.489 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.489 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.490 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.490 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.490 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.490 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.491 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.491 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.491 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.491 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.491 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.492 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.492 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.492 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.492 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.492 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.493 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.493 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.493 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.493 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.494 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.494 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.494 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.494 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.494 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.495 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.495 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.495 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.495 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.496 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.496 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.496 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.496 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.496 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.497 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.497 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.497 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.497 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.497 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.498 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.498 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.498 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.498 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.499 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.499 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.499 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.499 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.499 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.500 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.500 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.500 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.500 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.501 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.501 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.501 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.501 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.502 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.502 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.502 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.502 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.502 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.503 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.503 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.503 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.503 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.503 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.504 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.504 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.504 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.504 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.504 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.505 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.505 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.505 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.505 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.506 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.506 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.506 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.506 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.506 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.507 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.507 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.507 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.507 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.508 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.508 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.508 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.508 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.509 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.509 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.509 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.509 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.509 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.510 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.510 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.510 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.510 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.510 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.511 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.511 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.511 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.511 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.512 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.512 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.512 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.512 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.512 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.513 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.513 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.513 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.513 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.514 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.514 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.514 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.514 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.514 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.515 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.515 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.515 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.515 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.516 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.516 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.516 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.516 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.516 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.517 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.517 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.517 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.517 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.517 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.518 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.518 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.518 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.518 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.519 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.519 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.519 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.519 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.519 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.520 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.520 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.520 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.520 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.520 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.521 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.521 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.521 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.521 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.522 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.522 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.522 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.522 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.522 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.523 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.523 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.523 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.524 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.524 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.524 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.524 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.525 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.525 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.525 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.525 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.525 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.526 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.526 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.526 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.526 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.526 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.527 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.527 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.527 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.527 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.527 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.528 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.528 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.528 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.528 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.528 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.529 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.529 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.529 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.529 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.529 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.530 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.530 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.530 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.530 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.531 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.531 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.531 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.531 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.531 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.532 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.532 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.532 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.532 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.532 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.533 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.533 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.533 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.533 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.534 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.534 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.534 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.534 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.534 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.535 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.535 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.535 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.535 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.535 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.536 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.536 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.536 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.536 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.537 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.537 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.537 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.537 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.537 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.538 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.538 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.538 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.538 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.538 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.539 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.539 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.539 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.539 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.539 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.540 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.540 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.540 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.540 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.541 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.541 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.541 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.541 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.541 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.542 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.542 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.542 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.542 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.542 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.543 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.543 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.543 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.543 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.544 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.544 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.544 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.544 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.544 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.545 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.545 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.545 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.545 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.546 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.546 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.546 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.546 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.546 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.547 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.547 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.547 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.547 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.547 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.548 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.548 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.548 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.548 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.548 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.549 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.549 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.549 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.549 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.550 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.550 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.550 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.550 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.550 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.551 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.551 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.551 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.551 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.551 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.552 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.552 224805 WARNING oslo_config.cfg [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 06:43:04 compute-1 nova_compute[224801]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 06:43:04 compute-1 nova_compute[224801]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 06:43:04 compute-1 nova_compute[224801]: and ``live_migration_inbound_addr`` respectively.
Nov 29 06:43:04 compute-1 nova_compute[224801]: ).  Its value may be silently ignored in the future.
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.552 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.553 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.553 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.553 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.553 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.554 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.554 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.554 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.554 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.555 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.555 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.555 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.555 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.555 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.556 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.556 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.556 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.556 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.557 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rbd_secret_uuid        = 336ec58c-893b-528f-a0c1-6ed1196bc047 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.557 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.557 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.557 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.557 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.558 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.558 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.558 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.558 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.559 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.559 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.559 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.559 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.560 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.560 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.560 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.560 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.561 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.561 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.561 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.561 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.561 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.562 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.562 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.562 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.562 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.562 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.563 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.563 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.563 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.563 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.564 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.564 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.564 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.564 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.564 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.565 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.565 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.565 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.565 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.565 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.566 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.566 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.566 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.566 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.566 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.567 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.567 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.567 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.567 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.567 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.568 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.568 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.568 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.568 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.568 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.569 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.569 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.569 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.569 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.571 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.571 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.571 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.571 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.572 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.572 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.572 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.572 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.573 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.573 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.573 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.573 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.574 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.574 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.574 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.574 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.574 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.575 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.575 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.575 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.575 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.575 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.576 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.576 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.576 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.576 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.576 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.577 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.577 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.577 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.577 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.577 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.578 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.578 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.578 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.578 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.578 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.579 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.579 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.579 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.579 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.580 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.580 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.580 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.580 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.580 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.581 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.581 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.581 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.581 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.582 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.582 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.582 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.582 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.582 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.583 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.583 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.583 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.583 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.584 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.584 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.584 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.584 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.584 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.585 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.585 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.585 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.585 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.586 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.586 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.586 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.586 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.586 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.587 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.587 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.587 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.587 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.588 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.588 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.588 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.588 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.588 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.589 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.589 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.589 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.589 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.589 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.591 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.591 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.591 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.591 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.599 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.599 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.599 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.599 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.599 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.601 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.601 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.601 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.601 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.601 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.602 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.602 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.602 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.602 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.602 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.603 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.603 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.603 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.603 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.613 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.613 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.613 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.613 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.613 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.617 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.617 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.617 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.617 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.617 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.629 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.629 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.629 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.629 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.629 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.630 224805 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.647 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.647 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.648 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.648 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 29 06:43:04 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 06:43:04 compute-1 systemd[1]: Started libvirt QEMU daemon.
Nov 29 06:43:04 compute-1 python3.9[225317]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.780 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd834e6c790> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.783 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd834e6c790> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.784 224805 INFO nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Connection event '1' reason 'None'
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.800 224805 WARNING nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Nov 29 06:43:04 compute-1 nova_compute[224801]: 2025-11-29 06:43:04.802 224805 DEBUG nova.virt.libvirt.volume.mount [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 29 06:43:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:05.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:05 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.707 224805 INFO nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 06:43:05 compute-1 nova_compute[224801]: 
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <host>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <uuid>6289b14c-9d0e-4084-a899-2566f6eb59ac</uuid>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <cpu>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <arch>x86_64</arch>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model>EPYC-Rome-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <vendor>AMD</vendor>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <microcode version='16777317'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <signature family='23' model='49' stepping='0'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='x2apic'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='tsc-deadline'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='osxsave'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='hypervisor'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='tsc_adjust'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='spec-ctrl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='stibp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='arch-capabilities'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='cmp_legacy'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='topoext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='virt-ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='lbrv'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='tsc-scale'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='vmcb-clean'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='pause-filter'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='pfthreshold'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='svme-addr-chk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='rdctl-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='skip-l1dfl-vmentry'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='mds-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature name='pschange-mc-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <pages unit='KiB' size='4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <pages unit='KiB' size='2048'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <pages unit='KiB' size='1048576'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </cpu>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <power_management>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <suspend_mem/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </power_management>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <iommu support='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <migration_features>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <live/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <uri_transports>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <uri_transport>tcp</uri_transport>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <uri_transport>rdma</uri_transport>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </uri_transports>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </migration_features>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <topology>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <cells num='1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <cell id='0'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:           <memory unit='KiB'>7864316</memory>
Nov 29 06:43:05 compute-1 nova_compute[224801]:           <pages unit='KiB' size='4'>1966079</pages>
Nov 29 06:43:05 compute-1 nova_compute[224801]:           <pages unit='KiB' size='2048'>0</pages>
Nov 29 06:43:05 compute-1 nova_compute[224801]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 29 06:43:05 compute-1 nova_compute[224801]:           <distances>
Nov 29 06:43:05 compute-1 nova_compute[224801]:             <sibling id='0' value='10'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:           </distances>
Nov 29 06:43:05 compute-1 nova_compute[224801]:           <cpus num='8'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:           </cpus>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         </cell>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </cells>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </topology>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <cache>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </cache>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <secmodel>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model>selinux</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <doi>0</doi>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </secmodel>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <secmodel>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model>dac</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <doi>0</doi>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </secmodel>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </host>
Nov 29 06:43:05 compute-1 nova_compute[224801]: 
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <guest>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <os_type>hvm</os_type>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <arch name='i686'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <wordsize>32</wordsize>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <domain type='qemu'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <domain type='kvm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </arch>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <features>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <pae/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <nonpae/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <acpi default='on' toggle='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <apic default='on' toggle='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <cpuselection/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <deviceboot/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <externalSnapshot/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </features>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </guest>
Nov 29 06:43:05 compute-1 nova_compute[224801]: 
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <guest>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <os_type>hvm</os_type>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <arch name='x86_64'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <wordsize>64</wordsize>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <domain type='qemu'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <domain type='kvm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </arch>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <features>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <acpi default='on' toggle='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <apic default='on' toggle='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <cpuselection/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <deviceboot/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <externalSnapshot/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </features>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </guest>
Nov 29 06:43:05 compute-1 nova_compute[224801]: 
Nov 29 06:43:05 compute-1 nova_compute[224801]: </capabilities>
Nov 29 06:43:05 compute-1 nova_compute[224801]: 
Nov 29 06:43:05 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.717 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:43:05 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.742 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 06:43:05 compute-1 nova_compute[224801]: <domainCapabilities>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <domain>kvm</domain>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <arch>i686</arch>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <vcpu max='240'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <iothreads supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <os supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <enum name='firmware'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <loader supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>rom</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pflash</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='readonly'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>yes</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>no</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='secure'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>no</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </loader>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </os>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <cpu>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>on</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>off</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='maximumMigratable'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>on</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>off</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <vendor>AMD</vendor>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='succor'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='custom' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:05 compute-1 sudo[225530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tawyfslddsvbpxcaranmdhzswwsmcqov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398585.2293272-4339-170854705785256/AnsiballZ_podman_container.py'
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 sudo[225530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-128'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-256'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-512'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='KnightsMill'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SierraForest'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='athlon'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='athlon-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='core2duo'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='core2duo-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='coreduo'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='coreduo-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='n270'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='n270-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='phenom'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='phenom-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </cpu>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <memoryBacking supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <enum name='sourceType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>file</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>anonymous</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>memfd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </memoryBacking>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <devices>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <disk supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='diskDevice'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>disk</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>cdrom</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>floppy</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>lun</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='bus'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>ide</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>fdc</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>scsi</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>sata</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </disk>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <graphics supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vnc</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>egl-headless</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>dbus</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </graphics>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <video supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='modelType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vga</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>cirrus</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>none</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>bochs</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>ramfb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </video>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <hostdev supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='mode'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>subsystem</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='startupPolicy'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>default</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>mandatory</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>requisite</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>optional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='subsysType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pci</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>scsi</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='capsType'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='pciBackend'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </hostdev>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <rng supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>random</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>egd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>builtin</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </rng>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <filesystem supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='driverType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>path</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>handle</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtiofs</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </filesystem>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <tpm supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tpm-tis</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tpm-crb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>emulator</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>external</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendVersion'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>2.0</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </tpm>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <redirdev supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='bus'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </redirdev>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <channel supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pty</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>unix</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </channel>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <crypto supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>qemu</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>builtin</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </crypto>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <interface supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>default</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>passt</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </interface>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <panic supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>isa</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>hyperv</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </panic>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <console supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>null</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vc</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pty</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>dev</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>file</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pipe</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>stdio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>udp</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tcp</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>unix</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>qemu-vdagent</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>dbus</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </console>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </devices>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <features>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <gic supported='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <genid supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <backup supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <async-teardown supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <ps2 supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <sev supported='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <sgx supported='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <hyperv supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='features'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>relaxed</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vapic</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>spinlocks</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vpindex</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>runtime</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>synic</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>stimer</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>reset</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vendor_id</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>frequencies</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>reenlightenment</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tlbflush</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>ipi</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>avic</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>emsr_bitmap</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>xmm_input</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <defaults>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </defaults>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </hyperv>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <launchSecurity supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='sectype'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tdx</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </launchSecurity>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </features>
Nov 29 06:43:05 compute-1 nova_compute[224801]: </domainCapabilities>
Nov 29 06:43:05 compute-1 nova_compute[224801]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:05 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.751 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 06:43:05 compute-1 nova_compute[224801]: <domainCapabilities>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <domain>kvm</domain>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <arch>i686</arch>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <vcpu max='4096'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <iothreads supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <os supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <enum name='firmware'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <loader supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>rom</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pflash</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='readonly'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>yes</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>no</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='secure'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>no</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </loader>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </os>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <cpu>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>on</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>off</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='maximumMigratable'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>on</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>off</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <vendor>AMD</vendor>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='succor'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='custom' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-128'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-256'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-512'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='KnightsMill'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SierraForest'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='athlon'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='athlon-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='core2duo'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='core2duo-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='coreduo'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='coreduo-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='n270'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='n270-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='phenom'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='phenom-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </cpu>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <memoryBacking supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <enum name='sourceType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>file</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>anonymous</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>memfd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </memoryBacking>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <devices>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <disk supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='diskDevice'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>disk</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>cdrom</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>floppy</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>lun</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='bus'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>fdc</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>scsi</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>sata</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </disk>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <graphics supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vnc</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>egl-headless</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>dbus</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </graphics>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <video supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='modelType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vga</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>cirrus</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>none</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>bochs</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>ramfb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </video>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <hostdev supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='mode'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>subsystem</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='startupPolicy'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>default</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>mandatory</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>requisite</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>optional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='subsysType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pci</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>scsi</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='capsType'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='pciBackend'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </hostdev>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <rng supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>random</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>egd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>builtin</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </rng>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <filesystem supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='driverType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>path</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>handle</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtiofs</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </filesystem>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <tpm supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tpm-tis</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tpm-crb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>emulator</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>external</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendVersion'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>2.0</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </tpm>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <redirdev supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='bus'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </redirdev>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <channel supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pty</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>unix</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </channel>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <crypto supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>qemu</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>builtin</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </crypto>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <interface supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>default</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>passt</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </interface>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <panic supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>isa</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>hyperv</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </panic>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <console supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>null</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vc</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pty</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>dev</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>file</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pipe</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>stdio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>udp</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tcp</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>unix</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>qemu-vdagent</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>dbus</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </console>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </devices>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <features>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <gic supported='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <genid supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <backup supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <async-teardown supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <ps2 supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <sev supported='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <sgx supported='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <hyperv supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='features'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>relaxed</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vapic</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>spinlocks</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vpindex</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>runtime</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>synic</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>stimer</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>reset</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vendor_id</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>frequencies</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>reenlightenment</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tlbflush</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>ipi</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>avic</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>emsr_bitmap</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>xmm_input</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <defaults>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </defaults>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </hyperv>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <launchSecurity supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='sectype'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tdx</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </launchSecurity>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </features>
Nov 29 06:43:05 compute-1 nova_compute[224801]: </domainCapabilities>
Nov 29 06:43:05 compute-1 nova_compute[224801]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:05 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.794 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:43:05 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.799 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 06:43:05 compute-1 nova_compute[224801]: <domainCapabilities>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <domain>kvm</domain>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <arch>x86_64</arch>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <vcpu max='240'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <iothreads supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <os supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <enum name='firmware'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <loader supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>rom</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pflash</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='readonly'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>yes</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>no</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='secure'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>no</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </loader>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </os>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <cpu>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>on</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>off</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='maximumMigratable'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>on</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>off</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <vendor>AMD</vendor>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='succor'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='custom' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-128'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-256'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-512'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='KnightsMill'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SierraForest'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 ceph-mon[80754]: pgmap v919: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='athlon'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='athlon-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='core2duo'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='core2duo-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='coreduo'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='coreduo-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='n270'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='n270-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='phenom'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='phenom-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </cpu>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <memoryBacking supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <enum name='sourceType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>file</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>anonymous</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>memfd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </memoryBacking>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <devices>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <disk supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='diskDevice'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>disk</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>cdrom</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>floppy</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>lun</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='bus'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>ide</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>fdc</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>scsi</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>sata</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </disk>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <graphics supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vnc</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>egl-headless</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>dbus</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </graphics>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <video supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='modelType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vga</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>cirrus</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>none</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>bochs</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>ramfb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </video>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <hostdev supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='mode'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>subsystem</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='startupPolicy'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>default</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>mandatory</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>requisite</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>optional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='subsysType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pci</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>scsi</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='capsType'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='pciBackend'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </hostdev>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <rng supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtio-non-transitional</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>random</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>egd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>builtin</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </rng>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <filesystem supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='driverType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>path</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>handle</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>virtiofs</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </filesystem>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <tpm supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tpm-tis</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tpm-crb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>emulator</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>external</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendVersion'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>2.0</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </tpm>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <redirdev supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='bus'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </redirdev>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <channel supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pty</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>unix</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </channel>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <crypto supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>qemu</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>builtin</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </crypto>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <interface supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='backendType'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>default</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>passt</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </interface>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <panic supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>isa</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>hyperv</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </panic>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <console supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>null</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vc</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pty</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>dev</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>file</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pipe</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>stdio</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>udp</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tcp</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>unix</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>qemu-vdagent</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>dbus</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </console>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </devices>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <features>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <gic supported='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <genid supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <backup supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <async-teardown supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <ps2 supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <sev supported='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <sgx supported='no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <hyperv supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='features'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>relaxed</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vapic</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>spinlocks</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vpindex</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>runtime</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>synic</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>stimer</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>reset</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>vendor_id</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>frequencies</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>reenlightenment</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tlbflush</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>ipi</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>avic</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>emsr_bitmap</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>xmm_input</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <defaults>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </defaults>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </hyperv>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <launchSecurity supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='sectype'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>tdx</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </launchSecurity>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </features>
Nov 29 06:43:05 compute-1 nova_compute[224801]: </domainCapabilities>
Nov 29 06:43:05 compute-1 nova_compute[224801]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:05 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.861 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 06:43:05 compute-1 nova_compute[224801]: <domainCapabilities>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <domain>kvm</domain>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <arch>x86_64</arch>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <vcpu max='4096'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <iothreads supported='yes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <os supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <enum name='firmware'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>efi</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <loader supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>rom</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>pflash</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='readonly'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>yes</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>no</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='secure'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>yes</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>no</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </loader>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   </os>
Nov 29 06:43:05 compute-1 nova_compute[224801]:   <cpu>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>on</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>off</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <enum name='maximumMigratable'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>on</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <value>off</value>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <vendor>AMD</vendor>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='succor'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:05 compute-1 nova_compute[224801]:     <mode name='custom' supported='yes'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Denverton-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='auto-ibrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amd-psfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='stibp-always-on'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='EPYC-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-128'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-256'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx10-512'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='prefetchiti'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Haswell-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='KnightsMill'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512er'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512pf'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fma4'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tbm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xop'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='amx-tile'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-bf16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-fp16'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bitalg'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrc'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fzrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='la57'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='taa-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xfd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SierraForest'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ifma'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cmpccxadd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fbsdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='fsrs'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='ibrs-all'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mcdt-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pbrsb-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='psdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='serialize'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vaes'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='hle'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='rtm'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512bw'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512cd'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512dq'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512f'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='avx512vl'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='invpcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pcid'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='pku'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='mpx'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='core-capability'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='split-lock-detect'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='cldemote'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='erms'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='gfni'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdir64b'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='movdiri'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='xsaves'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='athlon'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:05 compute-1 nova_compute[224801]:       <blockers model='athlon-v1'>
Nov 29 06:43:05 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <blockers model='core2duo'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <blockers model='core2duo-v1'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <blockers model='coreduo'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <blockers model='coreduo-v1'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <blockers model='n270'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <blockers model='n270-v1'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <feature name='ss'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <blockers model='phenom'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <blockers model='phenom-v1'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <feature name='3dnow'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <feature name='3dnowext'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </blockers>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </mode>
Nov 29 06:43:06 compute-1 nova_compute[224801]:   </cpu>
Nov 29 06:43:06 compute-1 nova_compute[224801]:   <memoryBacking supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <enum name='sourceType'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <value>file</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <value>anonymous</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <value>memfd</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:   </memoryBacking>
Nov 29 06:43:06 compute-1 nova_compute[224801]:   <devices>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <disk supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='diskDevice'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>disk</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>cdrom</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>floppy</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>lun</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='bus'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>fdc</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>scsi</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>sata</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>virtio-transitional</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>virtio-non-transitional</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </disk>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <graphics supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>vnc</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>egl-headless</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>dbus</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </graphics>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <video supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='modelType'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>vga</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>cirrus</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>none</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>bochs</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>ramfb</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </video>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <hostdev supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='mode'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>subsystem</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='startupPolicy'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>default</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>mandatory</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>requisite</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>optional</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='subsysType'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>pci</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>scsi</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='capsType'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='pciBackend'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </hostdev>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <rng supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>virtio</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>virtio-transitional</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>virtio-non-transitional</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>random</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>egd</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>builtin</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </rng>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <filesystem supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='driverType'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>path</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>handle</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>virtiofs</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </filesystem>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <tpm supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>tpm-tis</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>tpm-crb</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>emulator</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>external</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='backendVersion'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>2.0</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </tpm>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <redirdev supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='bus'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>usb</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </redirdev>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <channel supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>pty</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>unix</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </channel>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <crypto supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='model'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>qemu</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='backendModel'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>builtin</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </crypto>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <interface supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='backendType'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>default</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>passt</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </interface>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <panic supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='model'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>isa</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>hyperv</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </panic>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <console supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='type'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>null</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>vc</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>pty</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>dev</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>file</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>pipe</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>stdio</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>udp</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>tcp</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>unix</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>qemu-vdagent</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>dbus</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </console>
Nov 29 06:43:06 compute-1 nova_compute[224801]:   </devices>
Nov 29 06:43:06 compute-1 nova_compute[224801]:   <features>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <gic supported='no'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <genid supported='yes'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <backup supported='yes'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <async-teardown supported='yes'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <ps2 supported='yes'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <sev supported='no'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <sgx supported='no'/>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <hyperv supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='features'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>relaxed</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>vapic</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>spinlocks</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>vpindex</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>runtime</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>synic</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>stimer</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>reset</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>vendor_id</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>frequencies</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>reenlightenment</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>tlbflush</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>ipi</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>avic</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>emsr_bitmap</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>xmm_input</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <defaults>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </defaults>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </hyperv>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     <launchSecurity supported='yes'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       <enum name='sectype'>
Nov 29 06:43:06 compute-1 nova_compute[224801]:         <value>tdx</value>
Nov 29 06:43:06 compute-1 nova_compute[224801]:       </enum>
Nov 29 06:43:06 compute-1 nova_compute[224801]:     </launchSecurity>
Nov 29 06:43:06 compute-1 nova_compute[224801]:   </features>
Nov 29 06:43:06 compute-1 nova_compute[224801]: </domainCapabilities>
Nov 29 06:43:06 compute-1 nova_compute[224801]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.930 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.930 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.931 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.931 224805 INFO nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Secure Boot support detected
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.933 224805 INFO nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.933 224805 INFO nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.946 224805 DEBUG nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 06:43:06 compute-1 nova_compute[224801]:   <model>Nehalem</model>
Nov 29 06:43:06 compute-1 nova_compute[224801]: </cpu>
Nov 29 06:43:06 compute-1 nova_compute[224801]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.950 224805 DEBUG nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:05.984 224805 INFO nova.virt.node [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Determined node identity 774921e7-1fd5-4281-8c90-f7cd3ee5e01b from /var/lib/nova/compute_id
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.013 224805 WARNING nova.compute.manager [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Compute nodes ['774921e7-1fd5-4281-8c90-f7cd3ee5e01b'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 29 06:43:06 compute-1 python3.9[225533]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.061 224805 INFO nova.compute.manager [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 29 06:43:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:06.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:06 compute-1 sudo[225530]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.189 224805 WARNING nova.compute.manager [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.189 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.190 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.190 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.190 224805 DEBUG nova.compute.resource_tracker [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.191 224805 DEBUG oslo_concurrency.processutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:43:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:43:06 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3797343968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.649 224805 DEBUG oslo_concurrency.processutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:43:06 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:43:06 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 06:43:06 compute-1 systemd[1]: Started libvirt nodedev daemon.
Nov 29 06:43:06 compute-1 sudo[225749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccqaavhgiegmxpcanrnookwgsxowzgjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398586.436379-4363-9722227520443/AnsiballZ_systemd.py'
Nov 29 06:43:06 compute-1 sudo[225749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:06 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/3410839554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:06 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3797343968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.983 224805 WARNING nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.985 224805 DEBUG nova.compute.resource_tracker [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5341MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.985 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:06 compute-1 nova_compute[224801]: 2025-11-29 06:43:06.986 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:07 compute-1 nova_compute[224801]: 2025-11-29 06:43:07.039 224805 WARNING nova.compute.resource_tracker [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] No compute node record for compute-1.ctlplane.example.com:774921e7-1fd5-4281-8c90-f7cd3ee5e01b: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 774921e7-1fd5-4281-8c90-f7cd3ee5e01b could not be found.
Nov 29 06:43:07 compute-1 python3.9[225751]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:43:07 compute-1 systemd[1]: Stopping nova_compute container...
Nov 29 06:43:07 compute-1 nova_compute[224801]: 2025-11-29 06:43:07.244 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:07 compute-1 nova_compute[224801]: 2025-11-29 06:43:07.245 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:43:07 compute-1 nova_compute[224801]: 2025-11-29 06:43:07.245 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:43:07 compute-1 nova_compute[224801]: 2025-11-29 06:43:07.245 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:43:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:07.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:07 compute-1 virtqemud[225339]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 29 06:43:07 compute-1 virtqemud[225339]: hostname: compute-1
Nov 29 06:43:07 compute-1 virtqemud[225339]: End of file while reading data: Input/output error
Nov 29 06:43:07 compute-1 systemd[1]: libpod-31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955.scope: Deactivated successfully.
Nov 29 06:43:07 compute-1 systemd[1]: libpod-31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955.scope: Consumed 4.073s CPU time.
Nov 29 06:43:07 compute-1 podman[225757]: 2025-11-29 06:43:07.950661434 +0000 UTC m=+0.755315722 container died 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:43:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:08.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:08 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955-userdata-shm.mount: Deactivated successfully.
Nov 29 06:43:08 compute-1 systemd[1]: var-lib-containers-storage-overlay-545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48-merged.mount: Deactivated successfully.
Nov 29 06:43:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:09.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:10.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:11 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:11 compute-1 podman[225757]: 2025-11-29 06:43:11.302740628 +0000 UTC m=+4.107394926 container cleanup 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 06:43:11 compute-1 podman[225757]: nova_compute
Nov 29 06:43:11 compute-1 ceph-mon[80754]: pgmap v920: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:11 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2326258606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:11 compute-1 podman[225787]: nova_compute
Nov 29 06:43:11 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 29 06:43:11 compute-1 systemd[1]: Stopped nova_compute container.
Nov 29 06:43:11 compute-1 systemd[1]: Starting nova_compute container...
Nov 29 06:43:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:11.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:11 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:43:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:11 compute-1 podman[225800]: 2025-11-29 06:43:11.586750463 +0000 UTC m=+0.179945241 container init 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 06:43:11 compute-1 podman[225800]: 2025-11-29 06:43:11.5992946 +0000 UTC m=+0.192489358 container start 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 06:43:11 compute-1 podman[225800]: nova_compute
Nov 29 06:43:11 compute-1 nova_compute[225815]: + sudo -E kolla_set_configs
Nov 29 06:43:11 compute-1 systemd[1]: Started nova_compute container.
Nov 29 06:43:11 compute-1 sudo[225749]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Validating config file
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying service configuration files
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Deleting /etc/ceph
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Creating directory /etc/ceph
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Writing out command to execute
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:11 compute-1 nova_compute[225815]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:43:11 compute-1 nova_compute[225815]: ++ cat /run_command
Nov 29 06:43:11 compute-1 nova_compute[225815]: + CMD=nova-compute
Nov 29 06:43:11 compute-1 nova_compute[225815]: + ARGS=
Nov 29 06:43:11 compute-1 nova_compute[225815]: + sudo kolla_copy_cacerts
Nov 29 06:43:11 compute-1 nova_compute[225815]: + [[ ! -n '' ]]
Nov 29 06:43:11 compute-1 nova_compute[225815]: + . kolla_extend_start
Nov 29 06:43:11 compute-1 nova_compute[225815]: Running command: 'nova-compute'
Nov 29 06:43:11 compute-1 nova_compute[225815]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 06:43:11 compute-1 nova_compute[225815]: + umask 0022
Nov 29 06:43:11 compute-1 nova_compute[225815]: + exec nova-compute
Nov 29 06:43:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:12.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:12 compute-1 sudo[225976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abpuuwmassmrerftmnetvsdnkbawrrtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398591.9281042-4390-226112675810984/AnsiballZ_podman_container.py'
Nov 29 06:43:12 compute-1 sudo[225976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:12 compute-1 python3.9[225978]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 06:43:12 compute-1 ceph-mon[80754]: pgmap v921: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:12 compute-1 ceph-mon[80754]: pgmap v922: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:12 compute-1 systemd[1]: Started libpod-conmon-3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d.scope.
Nov 29 06:43:12 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:43:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dff9f09fc74a0a77d8626d02655dc1c388d815775966403bd82792086be1b196/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dff9f09fc74a0a77d8626d02655dc1c388d815775966403bd82792086be1b196/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dff9f09fc74a0a77d8626d02655dc1c388d815775966403bd82792086be1b196/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:12 compute-1 podman[226005]: 2025-11-29 06:43:12.820117997 +0000 UTC m=+0.151107221 container init 3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 06:43:12 compute-1 podman[226005]: 2025-11-29 06:43:12.8314125 +0000 UTC m=+0.162401704 container start 3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 29 06:43:12 compute-1 python3.9[225978]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Applying nova statedir ownership
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 29 06:43:12 compute-1 nova_compute_init[226026]: INFO:nova_statedir:Nova statedir ownership complete
Nov 29 06:43:12 compute-1 systemd[1]: libpod-3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d.scope: Deactivated successfully.
Nov 29 06:43:12 compute-1 podman[226041]: 2025-11-29 06:43:12.938148969 +0000 UTC m=+0.030547438 container died 3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0)
Nov 29 06:43:12 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d-userdata-shm.mount: Deactivated successfully.
Nov 29 06:43:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-dff9f09fc74a0a77d8626d02655dc1c388d815775966403bd82792086be1b196-merged.mount: Deactivated successfully.
Nov 29 06:43:12 compute-1 sudo[225976]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:12 compute-1 podman[226041]: 2025-11-29 06:43:12.989555074 +0000 UTC m=+0.081953543 container cleanup 3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm)
Nov 29 06:43:12 compute-1 systemd[1]: libpod-conmon-3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d.scope: Deactivated successfully.
Nov 29 06:43:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:13.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:13 compute-1 nova_compute[225815]: 2025-11-29 06:43:13.782 225819 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:13 compute-1 nova_compute[225815]: 2025-11-29 06:43:13.782 225819 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:13 compute-1 nova_compute[225815]: 2025-11-29 06:43:13.782 225819 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:43:13 compute-1 nova_compute[225815]: 2025-11-29 06:43:13.782 225819 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 29 06:43:13 compute-1 nova_compute[225815]: 2025-11-29 06:43:13.923 225819 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:43:13 compute-1 nova_compute[225815]: 2025-11-29 06:43:13.956 225819 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:43:13 compute-1 nova_compute[225815]: 2025-11-29 06:43:13.957 225819 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 06:43:14 compute-1 ceph-mon[80754]: pgmap v923: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:14.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:15 compute-1 ceph-mon[80754]: pgmap v924: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:15.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:43:15.911 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:43:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:43:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:16.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:16 compute-1 sshd-session[196810]: Connection closed by 192.168.122.30 port 32948
Nov 29 06:43:16 compute-1 sshd-session[196792]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:43:16 compute-1 systemd[1]: session-49.scope: Deactivated successfully.
Nov 29 06:43:16 compute-1 systemd[1]: session-49.scope: Consumed 2min 31.856s CPU time.
Nov 29 06:43:16 compute-1 systemd-logind[785]: Session 49 logged out. Waiting for processes to exit.
Nov 29 06:43:16 compute-1 systemd-logind[785]: Removed session 49.
Nov 29 06:43:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:17.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:17 compute-1 nova_compute[225815]: 2025-11-29 06:43:17.686 225819 INFO nova.virt.driver [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 29 06:43:17 compute-1 nova_compute[225815]: 2025-11-29 06:43:17.812 225819 INFO nova.compute.provider_config [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 29 06:43:17 compute-1 ceph-mon[80754]: pgmap v925: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:18.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.314 225819 DEBUG oslo_concurrency.lockutils [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.315 225819 DEBUG oslo_concurrency.lockutils [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.315 225819 DEBUG oslo_concurrency.lockutils [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.339 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.339 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.339 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.339 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.339 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.340 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.340 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.340 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.340 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.341 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.341 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.341 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.341 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.342 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.342 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.342 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.342 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.343 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.343 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.343 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.343 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.343 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.344 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.344 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.344 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.344 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.344 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.345 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.345 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.345 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.345 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.347 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.347 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.347 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.347 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.347 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.361 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.361 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.361 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.361 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.361 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.369 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.369 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.369 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.369 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.369 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.372 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.372 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.372 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.372 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.372 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.387 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.387 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.387 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.387 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.387 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.388 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.388 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.394 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.394 225819 WARNING oslo_config.cfg [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 06:43:19 compute-1 nova_compute[225815]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 06:43:19 compute-1 nova_compute[225815]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 06:43:19 compute-1 nova_compute[225815]: and ``live_migration_inbound_addr`` respectively.
Nov 29 06:43:19 compute-1 nova_compute[225815]: ).  Its value may be silently ignored in the future.
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.394 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.394 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.394 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rbd_secret_uuid        = 336ec58c-893b-528f-a0c1-6ed1196bc047 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.401 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.401 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.401 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.401 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.401 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.402 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.402 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.402 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.402 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.402 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.414 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.414 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.414 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.414 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.414 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.415 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.415 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.415 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.415 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.415 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.416 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.416 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.416 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.416 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.416 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.417 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.417 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.417 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.417 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.417 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.418 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.418 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.418 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.418 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.419 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.419 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.419 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.419 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.419 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.421 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.421 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.421 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.421 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.422 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.422 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.422 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.422 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.422 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.423 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.423 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.423 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.423 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.423 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.424 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.424 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.424 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.424 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.424 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.425 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.425 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.425 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.425 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.425 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.426 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.426 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.426 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.426 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.427 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.427 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.427 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.427 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.427 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.428 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.428 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.428 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.428 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.428 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.429 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.429 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.429 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.429 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.429 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.430 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.430 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.430 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.430 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.431 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.431 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.431 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.431 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.431 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.433 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.433 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.433 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.433 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.433 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.434 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.434 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.434 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.434 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:19.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.437 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.437 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.437 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.437 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.437 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.438 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.438 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.438 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.438 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.438 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.439 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.439 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.439 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.439 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.439 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.440 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.440 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.440 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.440 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.442 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.442 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.442 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.442 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.442 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.443 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.443 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.443 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.443 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.444 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.444 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.444 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.444 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.444 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.445 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.445 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.445 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.445 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.445 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.446 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.446 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.446 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.446 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.446 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.448 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.448 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.448 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.448 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.448 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.449 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.449 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.449 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.449 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.449 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.451 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.451 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.451 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.451 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.451 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.452 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.452 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.452 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.452 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.456 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.456 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.456 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.456 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.456 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.459 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.459 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.459 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.459 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.459 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.470 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.470 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:43:19 compute-1 nova_compute[225815]: 2025-11-29 06:43:19.471 225819 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 29 06:43:19 compute-1 ceph-mon[80754]: pgmap v926: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:20.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.222 225819 INFO nova.virt.node [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Determined node identity 774921e7-1fd5-4281-8c90-f7cd3ee5e01b from /var/lib/nova/compute_id
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.224 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.225 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.225 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.226 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.242 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd5043b5fa0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.245 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd5043b5fa0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.246 225819 INFO nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Connection event '1' reason 'None'
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.257 225819 INFO nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 06:43:20 compute-1 nova_compute[225815]: 
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <host>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <uuid>6289b14c-9d0e-4084-a899-2566f6eb59ac</uuid>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <cpu>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <arch>x86_64</arch>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model>EPYC-Rome-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <vendor>AMD</vendor>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <microcode version='16777317'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <signature family='23' model='49' stepping='0'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='x2apic'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='tsc-deadline'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='osxsave'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='hypervisor'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='tsc_adjust'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='spec-ctrl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='stibp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='arch-capabilities'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='cmp_legacy'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='topoext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='virt-ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='lbrv'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='tsc-scale'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='vmcb-clean'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='pause-filter'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='pfthreshold'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='svme-addr-chk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='rdctl-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='skip-l1dfl-vmentry'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='mds-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature name='pschange-mc-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <pages unit='KiB' size='4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <pages unit='KiB' size='2048'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <pages unit='KiB' size='1048576'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </cpu>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <power_management>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <suspend_mem/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </power_management>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <iommu support='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <migration_features>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <live/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <uri_transports>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <uri_transport>tcp</uri_transport>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <uri_transport>rdma</uri_transport>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </uri_transports>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </migration_features>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <topology>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <cells num='1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <cell id='0'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:           <memory unit='KiB'>7864316</memory>
Nov 29 06:43:20 compute-1 nova_compute[225815]:           <pages unit='KiB' size='4'>1966079</pages>
Nov 29 06:43:20 compute-1 nova_compute[225815]:           <pages unit='KiB' size='2048'>0</pages>
Nov 29 06:43:20 compute-1 nova_compute[225815]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 29 06:43:20 compute-1 nova_compute[225815]:           <distances>
Nov 29 06:43:20 compute-1 nova_compute[225815]:             <sibling id='0' value='10'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:           </distances>
Nov 29 06:43:20 compute-1 nova_compute[225815]:           <cpus num='8'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:           </cpus>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         </cell>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </cells>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </topology>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <cache>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </cache>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <secmodel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model>selinux</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <doi>0</doi>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </secmodel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <secmodel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model>dac</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <doi>0</doi>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </secmodel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </host>
Nov 29 06:43:20 compute-1 nova_compute[225815]: 
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <guest>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <os_type>hvm</os_type>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <arch name='i686'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <wordsize>32</wordsize>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <domain type='qemu'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <domain type='kvm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </arch>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <features>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <pae/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <nonpae/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <acpi default='on' toggle='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <apic default='on' toggle='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <cpuselection/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <deviceboot/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <externalSnapshot/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </features>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </guest>
Nov 29 06:43:20 compute-1 nova_compute[225815]: 
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <guest>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <os_type>hvm</os_type>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <arch name='x86_64'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <wordsize>64</wordsize>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <domain type='qemu'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <domain type='kvm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </arch>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <features>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <acpi default='on' toggle='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <apic default='on' toggle='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <cpuselection/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <deviceboot/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <externalSnapshot/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </features>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </guest>
Nov 29 06:43:20 compute-1 nova_compute[225815]: 
Nov 29 06:43:20 compute-1 nova_compute[225815]: </capabilities>
Nov 29 06:43:20 compute-1 nova_compute[225815]: 
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.264 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.269 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 06:43:20 compute-1 nova_compute[225815]: <domainCapabilities>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <domain>kvm</domain>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <arch>i686</arch>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <vcpu max='4096'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <iothreads supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <os supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <enum name='firmware'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <loader supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>rom</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pflash</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='readonly'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>yes</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>no</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='secure'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>no</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </loader>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </os>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <cpu>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>on</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>off</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='maximumMigratable'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>on</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>off</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <vendor>AMD</vendor>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='succor'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='custom' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='auto-ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='auto-ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-128'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-256'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-512'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='KnightsMill'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512er'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512pf'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512er'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512pf'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tbm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tbm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SierraForest'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cmpccxadd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cmpccxadd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='athlon'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='athlon-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='core2duo'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='core2duo-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='coreduo'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='coreduo-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='n270'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='n270-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='phenom'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='phenom-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </cpu>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <memoryBacking supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <enum name='sourceType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>file</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>anonymous</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>memfd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </memoryBacking>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <devices>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <disk supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='diskDevice'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>disk</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>cdrom</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>floppy</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>lun</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='bus'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>fdc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>scsi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>sata</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-non-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </disk>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <graphics supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vnc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>egl-headless</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dbus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </graphics>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <video supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='modelType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vga</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>cirrus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>none</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>bochs</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>ramfb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </video>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <hostdev supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='mode'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>subsystem</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='startupPolicy'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>default</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>mandatory</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>requisite</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>optional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='subsysType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pci</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>scsi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='capsType'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='pciBackend'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </hostdev>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <rng supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-non-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>random</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>egd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>builtin</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </rng>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <filesystem supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='driverType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>path</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>handle</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtiofs</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </filesystem>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <tpm supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tpm-tis</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tpm-crb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>emulator</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>external</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendVersion'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>2.0</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </tpm>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <redirdev supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='bus'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </redirdev>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <channel supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pty</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>unix</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </channel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <crypto supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>qemu</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>builtin</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </crypto>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <interface supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>default</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>passt</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </interface>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <panic supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>isa</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>hyperv</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </panic>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <console supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>null</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pty</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dev</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>file</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pipe</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>stdio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>udp</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tcp</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>unix</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>qemu-vdagent</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dbus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </console>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </devices>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <features>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <gic supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <genid supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <backup supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <async-teardown supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <ps2 supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <sev supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <sgx supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <hyperv supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='features'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>relaxed</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vapic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>spinlocks</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vpindex</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>runtime</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>synic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>stimer</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>reset</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vendor_id</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>frequencies</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>reenlightenment</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tlbflush</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>ipi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>avic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>emsr_bitmap</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>xmm_input</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <defaults>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </defaults>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </hyperv>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <launchSecurity supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='sectype'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tdx</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </launchSecurity>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </features>
Nov 29 06:43:20 compute-1 nova_compute[225815]: </domainCapabilities>
Nov 29 06:43:20 compute-1 nova_compute[225815]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.275 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 06:43:20 compute-1 nova_compute[225815]: <domainCapabilities>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <domain>kvm</domain>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <arch>i686</arch>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <vcpu max='240'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <iothreads supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <os supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <enum name='firmware'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <loader supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>rom</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pflash</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='readonly'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>yes</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>no</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='secure'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>no</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </loader>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </os>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <cpu>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>on</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>off</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='maximumMigratable'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>on</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>off</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <vendor>AMD</vendor>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='succor'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='custom' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='auto-ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='auto-ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-128'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-256'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-512'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='KnightsMill'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512er'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512pf'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512er'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512pf'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tbm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tbm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SierraForest'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cmpccxadd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cmpccxadd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='athlon'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='athlon-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='core2duo'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='core2duo-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='coreduo'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='coreduo-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='n270'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='n270-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='phenom'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='phenom-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </cpu>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <memoryBacking supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <enum name='sourceType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>file</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>anonymous</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>memfd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </memoryBacking>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <devices>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <disk supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='diskDevice'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>disk</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>cdrom</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>floppy</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>lun</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='bus'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>ide</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>fdc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>scsi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>sata</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-non-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </disk>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <graphics supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vnc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>egl-headless</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dbus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </graphics>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <video supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='modelType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vga</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>cirrus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>none</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>bochs</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>ramfb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </video>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <hostdev supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='mode'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>subsystem</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='startupPolicy'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>default</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>mandatory</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>requisite</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>optional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='subsysType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pci</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>scsi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='capsType'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='pciBackend'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </hostdev>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <rng supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-non-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>random</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>egd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>builtin</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </rng>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <filesystem supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='driverType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>path</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>handle</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtiofs</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </filesystem>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <tpm supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tpm-tis</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tpm-crb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>emulator</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>external</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendVersion'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>2.0</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </tpm>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <redirdev supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='bus'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </redirdev>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <channel supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pty</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>unix</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </channel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <crypto supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>qemu</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>builtin</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </crypto>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <interface supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>default</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>passt</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </interface>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <panic supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>isa</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>hyperv</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </panic>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <console supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>null</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pty</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dev</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>file</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pipe</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>stdio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>udp</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tcp</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>unix</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>qemu-vdagent</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dbus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </console>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </devices>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <features>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <gic supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <genid supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <backup supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <async-teardown supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <ps2 supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <sev supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <sgx supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <hyperv supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='features'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>relaxed</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vapic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>spinlocks</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vpindex</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>runtime</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>synic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>stimer</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>reset</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vendor_id</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>frequencies</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>reenlightenment</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tlbflush</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>ipi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>avic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>emsr_bitmap</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>xmm_input</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <defaults>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </defaults>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </hyperv>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <launchSecurity supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='sectype'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tdx</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </launchSecurity>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </features>
Nov 29 06:43:20 compute-1 nova_compute[225815]: </domainCapabilities>
Nov 29 06:43:20 compute-1 nova_compute[225815]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.326 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.331 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 06:43:20 compute-1 nova_compute[225815]: <domainCapabilities>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <domain>kvm</domain>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <arch>x86_64</arch>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <vcpu max='4096'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <iothreads supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <os supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <enum name='firmware'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>efi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <loader supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>rom</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pflash</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='readonly'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>yes</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>no</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='secure'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>yes</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>no</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </loader>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </os>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <cpu>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>on</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>off</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='maximumMigratable'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>on</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>off</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <vendor>AMD</vendor>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='succor'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='custom' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='auto-ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='auto-ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-128'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-256'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-512'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='KnightsMill'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512er'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512pf'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512er'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512pf'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tbm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tbm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SierraForest'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cmpccxadd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cmpccxadd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='athlon'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='athlon-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='core2duo'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='core2duo-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='coreduo'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='coreduo-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='n270'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='n270-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='phenom'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='phenom-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </cpu>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <memoryBacking supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <enum name='sourceType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>file</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>anonymous</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>memfd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </memoryBacking>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <devices>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <disk supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='diskDevice'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>disk</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>cdrom</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>floppy</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>lun</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='bus'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>fdc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>scsi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>sata</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-non-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </disk>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <graphics supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vnc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>egl-headless</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dbus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </graphics>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <video supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='modelType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vga</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>cirrus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>none</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>bochs</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>ramfb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </video>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <hostdev supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='mode'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>subsystem</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='startupPolicy'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>default</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>mandatory</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>requisite</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>optional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='subsysType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pci</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>scsi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='capsType'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='pciBackend'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </hostdev>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <rng supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-non-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>random</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>egd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>builtin</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </rng>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <filesystem supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='driverType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>path</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>handle</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtiofs</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </filesystem>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <tpm supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tpm-tis</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tpm-crb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>emulator</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>external</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendVersion'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>2.0</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </tpm>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <redirdev supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='bus'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </redirdev>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <channel supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pty</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>unix</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </channel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <crypto supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>qemu</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>builtin</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </crypto>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <interface supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>default</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>passt</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </interface>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <panic supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>isa</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>hyperv</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </panic>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <console supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>null</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pty</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dev</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>file</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pipe</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>stdio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>udp</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tcp</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>unix</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>qemu-vdagent</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dbus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </console>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </devices>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <features>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <gic supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <genid supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <backup supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <async-teardown supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <ps2 supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <sev supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <sgx supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <hyperv supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='features'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>relaxed</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vapic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>spinlocks</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vpindex</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>runtime</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>synic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>stimer</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>reset</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vendor_id</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>frequencies</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>reenlightenment</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tlbflush</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>ipi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>avic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>emsr_bitmap</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>xmm_input</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <defaults>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </defaults>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </hyperv>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <launchSecurity supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='sectype'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tdx</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </launchSecurity>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </features>
Nov 29 06:43:20 compute-1 nova_compute[225815]: </domainCapabilities>
Nov 29 06:43:20 compute-1 nova_compute[225815]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.399 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 06:43:20 compute-1 nova_compute[225815]: <domainCapabilities>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <domain>kvm</domain>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <arch>x86_64</arch>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <vcpu max='240'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <iothreads supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <os supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <enum name='firmware'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <loader supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>rom</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pflash</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='readonly'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>yes</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>no</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='secure'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>no</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </loader>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </os>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <cpu>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>on</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>off</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='maximum' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='maximumMigratable'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>on</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>off</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='host-model' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <vendor>AMD</vendor>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='x2apic'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='stibp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='succor'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='lbrv'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <mode name='custom' supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Broadwell-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Cooperlake-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Denverton-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Dhyana-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Genoa'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='auto-ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='auto-ibrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amd-psfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='no-nested-data-bp'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='null-sel-clr-base'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='stibp-always-on'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='EPYC-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-128'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-256'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx10-512'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='prefetchiti'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Haswell-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='IvyBridge-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='KnightsMill'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512er'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512pf'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='KnightsMill-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4fmaps'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-4vnniw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512er'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512pf'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tbm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fma4'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tbm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xop'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='amx-tile'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-bf16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-fp16'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bitalg'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vbmi2'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrc'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fzrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='la57'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='taa-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='tsx-ldtrk'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xfd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SierraForest'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cmpccxadd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='SierraForest-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ifma'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-ne-convert'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx-vnni-int8'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='bus-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cmpccxadd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fbsdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='fsrs'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ibrs-all'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mcdt-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pbrsb-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='psdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='serialize'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vaes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='vpclmulqdq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='hle'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='rtm'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512bw'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512cd'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512dq'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512f'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='avx512vl'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='invpcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pcid'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='pku'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='mpx'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v2'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v3'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='core-capability'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='split-lock-detect'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='Snowridge-v4'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='cldemote'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='erms'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='gfni'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdir64b'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='movdiri'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='xsaves'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='athlon'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='athlon-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='core2duo'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='core2duo-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='coreduo'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='coreduo-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='n270'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='n270-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='ss'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='phenom'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <blockers model='phenom-v1'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnow'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <feature name='3dnowext'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </blockers>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </mode>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </cpu>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <memoryBacking supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <enum name='sourceType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>file</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>anonymous</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <value>memfd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </memoryBacking>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <devices>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <disk supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='diskDevice'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>disk</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>cdrom</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>floppy</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>lun</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='bus'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>ide</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>fdc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>scsi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>sata</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-non-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </disk>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <graphics supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vnc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>egl-headless</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dbus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </graphics>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <video supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='modelType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vga</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>cirrus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>none</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>bochs</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>ramfb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </video>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <hostdev supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='mode'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>subsystem</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='startupPolicy'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>default</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>mandatory</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>requisite</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>optional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='subsysType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pci</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>scsi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='capsType'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='pciBackend'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </hostdev>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <rng supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtio-non-transitional</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>random</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>egd</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>builtin</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </rng>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <filesystem supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='driverType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>path</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>handle</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>virtiofs</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </filesystem>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <tpm supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tpm-tis</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tpm-crb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>emulator</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>external</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendVersion'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>2.0</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </tpm>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <redirdev supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='bus'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>usb</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </redirdev>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <channel supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pty</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>unix</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </channel>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <crypto supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>qemu</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendModel'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>builtin</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </crypto>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <interface supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='backendType'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>default</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>passt</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </interface>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <panic supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='model'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>isa</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>hyperv</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </panic>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <console supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='type'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>null</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vc</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pty</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dev</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>file</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>pipe</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>stdio</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>udp</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tcp</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>unix</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>qemu-vdagent</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>dbus</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </console>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </devices>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <features>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <gic supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <vmcoreinfo supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <genid supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <backingStoreInput supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <backup supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <async-teardown supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <ps2 supported='yes'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <sev supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <sgx supported='no'/>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <hyperv supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='features'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>relaxed</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vapic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>spinlocks</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vpindex</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>runtime</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>synic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>stimer</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>reset</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>vendor_id</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>frequencies</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>reenlightenment</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tlbflush</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>ipi</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>avic</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>emsr_bitmap</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>xmm_input</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <defaults>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <spinlocks>4095</spinlocks>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <stimer_direct>on</stimer_direct>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </defaults>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </hyperv>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     <launchSecurity supported='yes'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       <enum name='sectype'>
Nov 29 06:43:20 compute-1 nova_compute[225815]:         <value>tdx</value>
Nov 29 06:43:20 compute-1 nova_compute[225815]:       </enum>
Nov 29 06:43:20 compute-1 nova_compute[225815]:     </launchSecurity>
Nov 29 06:43:20 compute-1 nova_compute[225815]:   </features>
Nov 29 06:43:20 compute-1 nova_compute[225815]: </domainCapabilities>
Nov 29 06:43:20 compute-1 nova_compute[225815]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.469 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.469 225819 INFO nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Secure Boot support detected
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.472 225819 INFO nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.472 225819 INFO nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.496 225819 DEBUG nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 06:43:20 compute-1 nova_compute[225815]:   <model>Nehalem</model>
Nov 29 06:43:20 compute-1 nova_compute[225815]: </cpu>
Nov 29 06:43:20 compute-1 nova_compute[225815]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.499 225819 DEBUG nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 29 06:43:20 compute-1 nova_compute[225815]: 2025-11-29 06:43:20.636 225819 DEBUG nova.virt.libvirt.volume.mount [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 29 06:43:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:21 compute-1 podman[226114]: 2025-11-29 06:43:21.361479408 +0000 UTC m=+0.098659186 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 29 06:43:21 compute-1 nova_compute[225815]: 2025-11-29 06:43:21.394 225819 INFO nova.virt.node [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Determined node identity 774921e7-1fd5-4281-8c90-f7cd3ee5e01b from /var/lib/nova/compute_id
Nov 29 06:43:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:21.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:22 compute-1 nova_compute[225815]: 2025-11-29 06:43:22.028 225819 DEBUG nova.compute.manager [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Verified node 774921e7-1fd5-4281-8c90-f7cd3ee5e01b matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 29 06:43:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:22.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:22 compute-1 ceph-mon[80754]: pgmap v927: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:23.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:24 compute-1 nova_compute[225815]: 2025-11-29 06:43:24.070 225819 INFO nova.compute.manager [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 29 06:43:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:43:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:24.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:43:24 compute-1 ceph-mon[80754]: pgmap v928: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:24 compute-1 nova_compute[225815]: 2025-11-29 06:43:24.588 225819 ERROR nova.compute.manager [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Could not retrieve compute node resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '774921e7-1fd5-4281-8c90-f7cd3ee5e01b' not found: No resource provider with uuid 774921e7-1fd5-4281-8c90-f7cd3ee5e01b found  ", "request_id": "req-294ae9c0-9b11-49c9-9c76-3c1c56b62cdb"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '774921e7-1fd5-4281-8c90-f7cd3ee5e01b' not found: No resource provider with uuid 774921e7-1fd5-4281-8c90-f7cd3ee5e01b found  ", "request_id": "req-294ae9c0-9b11-49c9-9c76-3c1c56b62cdb"}]}
Nov 29 06:43:24 compute-1 nova_compute[225815]: 2025-11-29 06:43:24.987 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:24 compute-1 nova_compute[225815]: 2025-11-29 06:43:24.988 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:24 compute-1 nova_compute[225815]: 2025-11-29 06:43:24.988 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:24 compute-1 nova_compute[225815]: 2025-11-29 06:43:24.988 225819 DEBUG nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:43:24 compute-1 nova_compute[225815]: 2025-11-29 06:43:24.989 225819 DEBUG oslo_concurrency.processutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:43:25 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/1216257790' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:25 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/766159143' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:25 compute-1 ceph-mon[80754]: pgmap v929: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:25.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:25 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:43:25 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3359582458' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:25 compute-1 nova_compute[225815]: 2025-11-29 06:43:25.470 225819 DEBUG oslo_concurrency.processutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:43:25 compute-1 nova_compute[225815]: 2025-11-29 06:43:25.620 225819 WARNING nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:43:25 compute-1 nova_compute[225815]: 2025-11-29 06:43:25.621 225819 DEBUG nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5302MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:43:25 compute-1 nova_compute[225815]: 2025-11-29 06:43:25.621 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:25 compute-1 nova_compute[225815]: 2025-11-29 06:43:25.621 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:26.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:26 compute-1 nova_compute[225815]: 2025-11-29 06:43:26.229 225819 ERROR nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '774921e7-1fd5-4281-8c90-f7cd3ee5e01b' not found: No resource provider with uuid 774921e7-1fd5-4281-8c90-f7cd3ee5e01b found  ", "request_id": "req-20388899-591a-493f-912c-230384eb308c"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '774921e7-1fd5-4281-8c90-f7cd3ee5e01b' not found: No resource provider with uuid 774921e7-1fd5-4281-8c90-f7cd3ee5e01b found  ", "request_id": "req-20388899-591a-493f-912c-230384eb308c"}]}
Nov 29 06:43:26 compute-1 nova_compute[225815]: 2025-11-29 06:43:26.229 225819 DEBUG nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:43:26 compute-1 nova_compute[225815]: 2025-11-29 06:43:26.230 225819 DEBUG nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:43:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:26 compute-1 nova_compute[225815]: 2025-11-29 06:43:26.412 225819 INFO nova.scheduler.client.report [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] [req-6d2ae49c-8926-4686-92df-ac92fd289c8a] Created resource provider record via placement API for resource provider with UUID 774921e7-1fd5-4281-8c90-f7cd3ee5e01b and name compute-1.ctlplane.example.com.
Nov 29 06:43:26 compute-1 nova_compute[225815]: 2025-11-29 06:43:26.450 225819 DEBUG oslo_concurrency.processutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:43:26 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3359582458' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:26 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2300023521' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:43:26 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/570139792' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:26 compute-1 nova_compute[225815]: 2025-11-29 06:43:26.896 225819 DEBUG oslo_concurrency.processutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:43:26 compute-1 nova_compute[225815]: 2025-11-29 06:43:26.902 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 29 06:43:26 compute-1 nova_compute[225815]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 29 06:43:26 compute-1 nova_compute[225815]: 2025-11-29 06:43:26.903 225819 INFO nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] kernel doesn't support AMD SEV
Nov 29 06:43:26 compute-1 nova_compute[225815]: 2025-11-29 06:43:26.904 225819 DEBUG nova.compute.provider_tree [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Updating inventory in ProviderTree for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:43:26 compute-1 nova_compute[225815]: 2025-11-29 06:43:26.905 225819 DEBUG nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:43:26 compute-1 nova_compute[225815]: 2025-11-29 06:43:26.910 225819 DEBUG nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt baseline CPU <cpu>
Nov 29 06:43:26 compute-1 nova_compute[225815]:   <arch>x86_64</arch>
Nov 29 06:43:26 compute-1 nova_compute[225815]:   <model>Nehalem</model>
Nov 29 06:43:26 compute-1 nova_compute[225815]:   <vendor>AMD</vendor>
Nov 29 06:43:26 compute-1 nova_compute[225815]:   <topology sockets="8" cores="1" threads="1"/>
Nov 29 06:43:26 compute-1 nova_compute[225815]: </cpu>
Nov 29 06:43:26 compute-1 nova_compute[225815]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Nov 29 06:43:27 compute-1 podman[226186]: 2025-11-29 06:43:27.330010481 +0000 UTC m=+0.061837446 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 29 06:43:27 compute-1 podman[226185]: 2025-11-29 06:43:27.363131978 +0000 UTC m=+0.092499135 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:43:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:27.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:27 compute-1 nova_compute[225815]: 2025-11-29 06:43:27.469 225819 DEBUG nova.scheduler.client.report [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Updated inventory for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 29 06:43:27 compute-1 nova_compute[225815]: 2025-11-29 06:43:27.470 225819 DEBUG nova.compute.provider_tree [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Updating resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 29 06:43:27 compute-1 nova_compute[225815]: 2025-11-29 06:43:27.470 225819 DEBUG nova.compute.provider_tree [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Updating inventory in ProviderTree for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:43:27 compute-1 nova_compute[225815]: 2025-11-29 06:43:27.719 225819 DEBUG nova.compute.provider_tree [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Updating resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 29 06:43:27 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2410258570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:27 compute-1 ceph-mon[80754]: pgmap v930: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:27 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/570139792' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:43:27 compute-1 nova_compute[225815]: 2025-11-29 06:43:27.855 225819 DEBUG nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:43:27 compute-1 nova_compute[225815]: 2025-11-29 06:43:27.855 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:27 compute-1 nova_compute[225815]: 2025-11-29 06:43:27.856 225819 DEBUG nova.service [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 29 06:43:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:28.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:28 compute-1 nova_compute[225815]: 2025-11-29 06:43:28.636 225819 DEBUG nova.service [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 29 06:43:28 compute-1 nova_compute[225815]: 2025-11-29 06:43:28.637 225819 DEBUG nova.servicegroup.drivers.db [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 29 06:43:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:29.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:30 compute-1 ceph-mon[80754]: pgmap v931: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:30.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:31.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:31 compute-1 ceph-mon[80754]: pgmap v932: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 06:43:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:32.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 06:43:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:33.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:33 compute-1 ceph-mon[80754]: pgmap v933: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:34.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:35.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:36.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:36 compute-1 ceph-mon[80754]: pgmap v934: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:37 compute-1 ceph-mon[80754]: pgmap v935: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:37.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:38.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:39.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:40.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:40 compute-1 ceph-mon[80754]: pgmap v936: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:41 compute-1 ceph-mon[80754]: pgmap v937: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:41.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:42.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:43.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:43 compute-1 nova_compute[225815]: 2025-11-29 06:43:43.640 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:43 compute-1 nova_compute[225815]: 2025-11-29 06:43:43.819 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:43 compute-1 ceph-mon[80754]: pgmap v938: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:44.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:45.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:45 compute-1 ceph-mon[80754]: pgmap v939: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:46.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:47 compute-1 ceph-mon[80754]: pgmap v940: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:47.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:48.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:49 compute-1 ceph-mon[80754]: pgmap v941: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:49.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:50.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:51.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:51 compute-1 ceph-mon[80754]: pgmap v942: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:52.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:52 compute-1 podman[226221]: 2025-11-29 06:43:52.382515134 +0000 UTC m=+0.110752562 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:43:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:53.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:53 compute-1 ceph-mon[80754]: pgmap v943: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:54.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:55.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:55 compute-1 ceph-mon[80754]: pgmap v944: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:43:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:56.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:43:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:43:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:57.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:57 compute-1 ceph-mon[80754]: pgmap v945: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:43:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:58.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:58 compute-1 podman[226249]: 2025-11-29 06:43:58.331466562 +0000 UTC m=+0.065536327 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 29 06:43:58 compute-1 podman[226248]: 2025-11-29 06:43:58.349342318 +0000 UTC m=+0.084410981 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:43:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:43:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:43:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:59.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:43:59 compute-1 sshd-session[226286]: Invalid user local from 93.157.248.178 port 38996
Nov 29 06:43:59 compute-1 sshd-session[226286]: Received disconnect from 93.157.248.178 port 38996:11: Bye Bye [preauth]
Nov 29 06:43:59 compute-1 sshd-session[226286]: Disconnected from invalid user local 93.157.248.178 port 38996 [preauth]
Nov 29 06:44:00 compute-1 ceph-mon[80754]: pgmap v946: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:00.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:01 compute-1 sshd-session[226220]: error: kex_exchange_identification: read: Connection timed out
Nov 29 06:44:01 compute-1 sshd-session[226220]: banner exchange: Connection from 119.45.242.7 port 54404: Connection timed out
Nov 29 06:44:01 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:01.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:02.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:02 compute-1 ceph-mon[80754]: pgmap v947: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:03.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:03 compute-1 sudo[226288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:03 compute-1 sudo[226288]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:03 compute-1 sudo[226288]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:03 compute-1 ceph-mon[80754]: pgmap v948: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:03 compute-1 sudo[226313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:44:03 compute-1 sudo[226313]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:03 compute-1 sudo[226313]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:03 compute-1 sudo[226338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:03 compute-1 sudo[226338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:03 compute-1 sudo[226338]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:03 compute-1 sudo[226363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:44:03 compute-1 sudo[226363]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:04.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:04 compute-1 sudo[226363]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:44:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:44:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:44:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:44:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:44:04 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:44:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:05.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:05 compute-1 ceph-mon[80754]: pgmap v949: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:06.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:07.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:07 compute-1 ceph-mon[80754]: pgmap v950: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:08.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:09.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:09 compute-1 ceph-mon[80754]: pgmap v951: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:10.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:11 compute-1 sudo[226418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:44:11 compute-1 sudo[226418]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:11 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:11 compute-1 sudo[226418]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:11 compute-1 sudo[226443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:44:11 compute-1 sudo[226443]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:44:11 compute-1 sudo[226443]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:11.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:12 compute-1 ceph-mon[80754]: pgmap v952: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:44:12 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:44:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:12.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:13.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:13 compute-1 nova_compute[225815]: 2025-11-29 06:44:13.969 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:13 compute-1 nova_compute[225815]: 2025-11-29 06:44:13.970 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:13 compute-1 nova_compute[225815]: 2025-11-29 06:44:13.970 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:44:13 compute-1 nova_compute[225815]: 2025-11-29 06:44:13.970 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:44:14 compute-1 ceph-mon[80754]: pgmap v953: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:14.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:15.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:44:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:44:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:44:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:44:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:44:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:44:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:16.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:16 compute-1 ceph-mon[80754]: pgmap v954: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:17 compute-1 ceph-mon[80754]: pgmap v955: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:17 compute-1 nova_compute[225815]: 2025-11-29 06:44:17.438 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.79 sec
Nov 29 06:44:17 compute-1 nova_compute[225815]: 2025-11-29 06:44:17.458 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:44:17 compute-1 nova_compute[225815]: 2025-11-29 06:44:17.458 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-1 nova_compute[225815]: 2025-11-29 06:44:17.459 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-1 nova_compute[225815]: 2025-11-29 06:44:17.460 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-1 nova_compute[225815]: 2025-11-29 06:44:17.460 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-1 nova_compute[225815]: 2025-11-29 06:44:17.461 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-1 nova_compute[225815]: 2025-11-29 06:44:17.461 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-1 nova_compute[225815]: 2025-11-29 06:44:17.462 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:44:17 compute-1 nova_compute[225815]: 2025-11-29 06:44:17.462 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:17.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:18.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:19.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:19 compute-1 ceph-mon[80754]: pgmap v956: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:20.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:21.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:22.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:22 compute-1 ceph-mon[80754]: pgmap v957: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:23 compute-1 podman[226468]: 2025-11-29 06:44:23.396177547 +0000 UTC m=+0.134508903 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:44:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:23.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:24.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:25 compute-1 ceph-mon[80754]: pgmap v958: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:25.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:26 compute-1 ceph-mon[80754]: pgmap v959: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:26.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:27.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:28 compute-1 ceph-mon[80754]: pgmap v960: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:28 compute-1 sshd-session[226494]: Received disconnect from 66.94.122.234 port 60664:11: Bye Bye [preauth]
Nov 29 06:44:28 compute-1 sshd-session[226494]: Disconnected from authenticating user root 66.94.122.234 port 60664 [preauth]
Nov 29 06:44:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:28.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:29 compute-1 podman[226497]: 2025-11-29 06:44:29.34131914 +0000 UTC m=+0.072476628 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:44:29 compute-1 podman[226496]: 2025-11-29 06:44:29.345979235 +0000 UTC m=+0.082721684 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:44:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:29.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:29 compute-1 ceph-mon[80754]: pgmap v961: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:30.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:31.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:32 compute-1 ceph-mon[80754]: pgmap v962: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:32.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:33.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:33 compute-1 ceph-mon[80754]: pgmap v963: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:34.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:35.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:36 compute-1 nova_compute[225815]: 2025-11-29 06:44:36.075 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:44:36 compute-1 nova_compute[225815]: 2025-11-29 06:44:36.075 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:44:36 compute-1 nova_compute[225815]: 2025-11-29 06:44:36.076 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:44:36 compute-1 nova_compute[225815]: 2025-11-29 06:44:36.076 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:44:36 compute-1 nova_compute[225815]: 2025-11-29 06:44:36.076 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:44:36 compute-1 nova_compute[225815]: 2025-11-29 06:44:36.100 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 8.66 sec
Nov 29 06:44:36 compute-1 ceph-mon[80754]: pgmap v964: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:36.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:44:36 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2284150235' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:36 compute-1 nova_compute[225815]: 2025-11-29 06:44:36.547 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:44:36 compute-1 nova_compute[225815]: 2025-11-29 06:44:36.761 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:44:36 compute-1 nova_compute[225815]: 2025-11-29 06:44:36.763 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5347MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:44:36 compute-1 nova_compute[225815]: 2025-11-29 06:44:36.764 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:44:36 compute-1 nova_compute[225815]: 2025-11-29 06:44:36.765 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:44:37 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2267221102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:37 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/2284150235' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:37 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/1970020033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:37.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:38 compute-1 ceph-mon[80754]: pgmap v965: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:38.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:39 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1438398855' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:44:39 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1438398855' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:44:39 compute-1 ceph-mon[80754]: pgmap v966: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:39 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:44:39 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4273924820' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:44:39 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:44:39 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4273924820' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:44:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:39.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:40 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/4273924820' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:44:40 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/4273924820' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:44:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:40.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:40 compute-1 nova_compute[225815]: 2025-11-29 06:44:40.465 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:44:40 compute-1 nova_compute[225815]: 2025-11-29 06:44:40.466 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:44:40 compute-1 nova_compute[225815]: 2025-11-29 06:44:40.518 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.937957) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680938056, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2761, "num_deletes": 509, "total_data_size": 6471047, "memory_usage": 6558432, "flush_reason": "Manual Compaction"}
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Nov 29 06:44:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:44:40 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/945539473' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:40 compute-1 nova_compute[225815]: 2025-11-29 06:44:40.966 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680971582, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 4239117, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15259, "largest_seqno": 18015, "table_properties": {"data_size": 4228438, "index_size": 6469, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3141, "raw_key_size": 23481, "raw_average_key_size": 18, "raw_value_size": 4205271, "raw_average_value_size": 3380, "num_data_blocks": 289, "num_entries": 1244, "num_filter_entries": 1244, "num_deletions": 509, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398425, "oldest_key_time": 1764398425, "file_creation_time": 1764398680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 33688 microseconds, and 14984 cpu microseconds.
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.971644) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 4239117 bytes OK
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.971663) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.974319) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.974346) EVENT_LOG_v1 {"time_micros": 1764398680974340, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.974365) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 6458171, prev total WAL file size 6458171, number of live WAL files 2.
Nov 29 06:44:40 compute-1 nova_compute[225815]: 2025-11-29 06:44:40.975 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.976778) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323535' seq:0, type:0; will stop at (end)
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(4139KB)], [30(9302KB)]
Nov 29 06:44:40 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680976866, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 13764972, "oldest_snapshot_seqno": -1}
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4723 keys, 11178332 bytes, temperature: kUnknown
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681050359, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11178332, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11142868, "index_size": 22554, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11845, "raw_key_size": 118497, "raw_average_key_size": 25, "raw_value_size": 11053537, "raw_average_value_size": 2340, "num_data_blocks": 935, "num_entries": 4723, "num_filter_entries": 4723, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.050717) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11178332 bytes
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.052885) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.0 rd, 151.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 9.1 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(5.9) write-amplify(2.6) OK, records in: 5757, records dropped: 1034 output_compression: NoCompression
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.052914) EVENT_LOG_v1 {"time_micros": 1764398681052903, "job": 16, "event": "compaction_finished", "compaction_time_micros": 73621, "compaction_time_cpu_micros": 27643, "output_level": 6, "num_output_files": 1, "total_output_size": 11178332, "num_input_records": 5757, "num_output_records": 4723, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681053720, "job": 16, "event": "table_file_deletion", "file_number": 32}
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681055825, "job": 16, "event": "table_file_deletion", "file_number": 30}
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.976646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.055941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.055951) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.055955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.055959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.055962) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:44:41 compute-1 nova_compute[225815]: 2025-11-29 06:44:41.287 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:44:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:41.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:42.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:42 compute-1 ceph-mon[80754]: pgmap v967: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:42 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/945539473' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:42 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/10673063' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:42 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3344413637' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:44:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:43.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:43 compute-1 ceph-mon[80754]: pgmap v968: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:44.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:45.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:45 compute-1 ceph-mon[80754]: pgmap v969: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:46.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:46 compute-1 nova_compute[225815]: 2025-11-29 06:44:46.818 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:44:46 compute-1 nova_compute[225815]: 2025-11-29 06:44:46.819 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 10.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:44:47 compute-1 ceph-mon[80754]: pgmap v970: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:47.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:47 compute-1 sshd-session[226574]: Invalid user steam from 119.45.242.7 port 36936
Nov 29 06:44:48 compute-1 sshd-session[226574]: Received disconnect from 119.45.242.7 port 36936:11: Bye Bye [preauth]
Nov 29 06:44:48 compute-1 sshd-session[226574]: Disconnected from invalid user steam 119.45.242.7 port 36936 [preauth]
Nov 29 06:44:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:48.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:49.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:50 compute-1 ceph-mon[80754]: pgmap v971: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:50.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:51.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:51 compute-1 ceph-mon[80754]: pgmap v972: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:52.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:53 compute-1 ceph-mon[80754]: pgmap v973: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:53.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:54.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:54 compute-1 podman[226576]: 2025-11-29 06:44:54.374892613 +0000 UTC m=+0.116178468 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:44:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:55.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:56 compute-1 ceph-mon[80754]: pgmap v974: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:56.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:44:57 compute-1 ceph-mon[80754]: pgmap v975: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:44:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:44:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:57.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:44:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:44:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:58.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:44:58 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1524599570' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:44:58 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1524599570' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:44:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:44:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:44:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:59.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:44:59 compute-1 ceph-mon[80754]: pgmap v976: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:00.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:00 compute-1 podman[226602]: 2025-11-29 06:45:00.316645186 +0000 UTC m=+0.053582138 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 06:45:00 compute-1 podman[226601]: 2025-11-29 06:45:00.342591346 +0000 UTC m=+0.082144419 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible)
Nov 29 06:45:01 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:01 compute-1 ceph-mon[80754]: pgmap v977: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:01.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:01 compute-1 anacron[30895]: Job `cron.weekly' started
Nov 29 06:45:01 compute-1 anacron[30895]: Job `cron.weekly' terminated
Nov 29 06:45:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:02.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.882140) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702882266, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 458, "num_deletes": 251, "total_data_size": 609015, "memory_usage": 618624, "flush_reason": "Manual Compaction"}
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702887441, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 401811, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18020, "largest_seqno": 18473, "table_properties": {"data_size": 399311, "index_size": 600, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6105, "raw_average_key_size": 18, "raw_value_size": 394334, "raw_average_value_size": 1209, "num_data_blocks": 28, "num_entries": 326, "num_filter_entries": 326, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398682, "oldest_key_time": 1764398682, "file_creation_time": 1764398702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 5431 microseconds, and 1922 cpu microseconds.
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.887584) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 401811 bytes OK
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.887632) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889239) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889256) EVENT_LOG_v1 {"time_micros": 1764398702889252, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889273) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 606194, prev total WAL file size 606194, number of live WAL files 2.
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.890121) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(392KB)], [33(10MB)]
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702890232, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 11580143, "oldest_snapshot_seqno": -1}
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4538 keys, 9460877 bytes, temperature: kUnknown
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702984972, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 9460877, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9428135, "index_size": 20280, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11397, "raw_key_size": 115324, "raw_average_key_size": 25, "raw_value_size": 9343388, "raw_average_value_size": 2058, "num_data_blocks": 832, "num_entries": 4538, "num_filter_entries": 4538, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.985453) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 9460877 bytes
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.987501) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 122.1 rd, 99.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 10.7 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(52.4) write-amplify(23.5) OK, records in: 5049, records dropped: 511 output_compression: NoCompression
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.987517) EVENT_LOG_v1 {"time_micros": 1764398702987510, "job": 18, "event": "compaction_finished", "compaction_time_micros": 94872, "compaction_time_cpu_micros": 24867, "output_level": 6, "num_output_files": 1, "total_output_size": 9460877, "num_input_records": 5049, "num_output_records": 4538, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702987678, "job": 18, "event": "table_file_deletion", "file_number": 35}
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702989534, "job": 18, "event": "table_file_deletion", "file_number": 33}
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889949) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.989634) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.989641) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.989644) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.989646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:02 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.989649) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:45:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:03.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:03 compute-1 ceph-mon[80754]: pgmap v978: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:04.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:05.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:05 compute-1 sshd-session[226644]: Received disconnect from 93.157.248.178 port 55240:11: Bye Bye [preauth]
Nov 29 06:45:05 compute-1 sshd-session[226644]: Disconnected from authenticating user root 93.157.248.178 port 55240 [preauth]
Nov 29 06:45:05 compute-1 ceph-mon[80754]: pgmap v979: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:06.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:07.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:08 compute-1 ceph-mon[80754]: pgmap v980: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:08.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:09.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:10 compute-1 ceph-mon[80754]: pgmap v981: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:10.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:11 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:11 compute-1 sudo[226646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:11 compute-1 sudo[226646]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:11 compute-1 sudo[226646]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:11.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:11 compute-1 sudo[226671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:45:11 compute-1 sudo[226671]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:11 compute-1 sudo[226671]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:11 compute-1 sudo[226696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:11 compute-1 sudo[226696]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:11 compute-1 sudo[226696]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:11 compute-1 sudo[226721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:45:11 compute-1 sudo[226721]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:12 compute-1 ceph-mon[80754]: pgmap v982: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:12 compute-1 sudo[226721]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:12.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:45:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:45:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:45:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:45:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:45:13 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:45:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:13.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:14.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:14 compute-1 ceph-mon[80754]: pgmap v983: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:15 compute-1 ceph-mon[80754]: pgmap v984: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:15.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:45:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:45:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:45:15.913 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:45:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:45:15.913 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:45:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:16.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:17.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:17 compute-1 sshd-session[226778]: Invalid user halo from 71.70.164.48 port 40719
Nov 29 06:45:17 compute-1 sshd-session[226778]: Received disconnect from 71.70.164.48 port 40719:11: Bye Bye [preauth]
Nov 29 06:45:17 compute-1 sshd-session[226778]: Disconnected from invalid user halo 71.70.164.48 port 40719 [preauth]
Nov 29 06:45:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:18.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:18 compute-1 ceph-mon[80754]: pgmap v985: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:19.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:19 compute-1 ceph-mon[80754]: pgmap v986: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:20.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:21 compute-1 ceph-mon[80754]: pgmap v987: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:21.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:22.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:23 compute-1 sudo[226780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:45:23 compute-1 sudo[226780]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:23 compute-1 sudo[226780]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:23.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:23 compute-1 sudo[226805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:45:23 compute-1 sudo[226805]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:45:23 compute-1 sudo[226805]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:24.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:24 compute-1 ceph-mon[80754]: pgmap v988: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:45:24 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:45:25 compute-1 ceph-mon[80754]: pgmap v989: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:25 compute-1 podman[226830]: 2025-11-29 06:45:25.37498302 +0000 UTC m=+0.106182948 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:45:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:25.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:26.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:27.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:27 compute-1 ceph-mon[80754]: pgmap v990: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:28.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:29.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:29 compute-1 ceph-mon[80754]: pgmap v991: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:30.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:31 compute-1 podman[226855]: 2025-11-29 06:45:31.332874827 +0000 UTC m=+0.066170568 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 06:45:31 compute-1 podman[226854]: 2025-11-29 06:45:31.332878757 +0000 UTC m=+0.071472201 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:45:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:31.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:32 compute-1 ceph-mon[80754]: pgmap v992: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:32.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:33 compute-1 ceph-mon[80754]: pgmap v993: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:33.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:34.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:35.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:36 compute-1 ceph-mon[80754]: pgmap v994: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:36.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:37 compute-1 ceph-mon[80754]: pgmap v995: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:37.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:38.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:39.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:39 compute-1 ceph-mon[80754]: pgmap v996: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:40.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:41.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:41 compute-1 ceph-mon[80754]: pgmap v997: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:42.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:43.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:43 compute-1 ceph-mon[80754]: pgmap v998: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:44.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:45.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:45 compute-1 ceph-mon[80754]: pgmap v999: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:46.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:46 compute-1 nova_compute[225815]: 2025-11-29 06:45:46.811 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:46 compute-1 nova_compute[225815]: 2025-11-29 06:45:46.812 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:47.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:48 compute-1 ceph-mon[80754]: pgmap v1000: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:48.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:49 compute-1 ceph-mon[80754]: pgmap v1001: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:49.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:50.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:51.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:51 compute-1 ceph-mon[80754]: pgmap v1002: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:45:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:52.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:45:52 compute-1 nova_compute[225815]: 2025-11-29 06:45:52.887 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:52 compute-1 nova_compute[225815]: 2025-11-29 06:45:52.887 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:45:52 compute-1 nova_compute[225815]: 2025-11-29 06:45:52.888 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:45:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:53.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:53 compute-1 ceph-mon[80754]: pgmap v1003: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:54.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:55 compute-1 nova_compute[225815]: 2025-11-29 06:45:55.563 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:45:55 compute-1 nova_compute[225815]: 2025-11-29 06:45:55.563 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-1 nova_compute[225815]: 2025-11-29 06:45:55.563 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-1 nova_compute[225815]: 2025-11-29 06:45:55.564 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-1 nova_compute[225815]: 2025-11-29 06:45:55.564 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-1 nova_compute[225815]: 2025-11-29 06:45:55.564 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-1 nova_compute[225815]: 2025-11-29 06:45:55.564 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-1 nova_compute[225815]: 2025-11-29 06:45:55.564 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:45:55 compute-1 nova_compute[225815]: 2025-11-29 06:45:55.565 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:45:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:55.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:45:55 compute-1 ceph-mon[80754]: pgmap v1004: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:45:56 compute-1 podman[226892]: 2025-11-29 06:45:56.362584674 +0000 UTC m=+0.098918952 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 06:45:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:56.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:57.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:57 compute-1 ceph-mon[80754]: pgmap v1005: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:58.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:45:59 compute-1 ceph-mon[80754]: pgmap v1006: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:45:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:45:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:45:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:59.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:00.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:01 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:01.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:02 compute-1 ceph-mon[80754]: pgmap v1007: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:02 compute-1 podman[226920]: 2025-11-29 06:46:02.334725556 +0000 UTC m=+0.070166115 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:46:02 compute-1 podman[226919]: 2025-11-29 06:46:02.352536787 +0000 UTC m=+0.084915814 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:46:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:02.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:03 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1002726251' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:46:03 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1002726251' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:46:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:03.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:04 compute-1 ceph-mon[80754]: pgmap v1008: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:04.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:05.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:06 compute-1 ceph-mon[80754]: pgmap v1009: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:06.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:07.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:08 compute-1 ceph-mon[80754]: pgmap v1010: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:08.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:09 compute-1 nova_compute[225815]: 2025-11-29 06:46:09.008 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:46:09 compute-1 nova_compute[225815]: 2025-11-29 06:46:09.009 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:46:09 compute-1 nova_compute[225815]: 2025-11-29 06:46:09.009 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:46:09 compute-1 nova_compute[225815]: 2025-11-29 06:46:09.010 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:46:09 compute-1 nova_compute[225815]: 2025-11-29 06:46:09.010 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:46:09 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:46:09 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3671393982' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:09 compute-1 nova_compute[225815]: 2025-11-29 06:46:09.470 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:46:09 compute-1 nova_compute[225815]: 2025-11-29 06:46:09.554 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.37 sec
Nov 29 06:46:09 compute-1 nova_compute[225815]: 2025-11-29 06:46:09.687 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:46:09 compute-1 nova_compute[225815]: 2025-11-29 06:46:09.688 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5358MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:46:09 compute-1 nova_compute[225815]: 2025-11-29 06:46:09.689 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:46:09 compute-1 nova_compute[225815]: 2025-11-29 06:46:09.689 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:46:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:09.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:10 compute-1 ceph-mon[80754]: pgmap v1011: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:10 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2861061763' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:10 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3232813680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:10 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3671393982' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:10.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:10 compute-1 nova_compute[225815]: 2025-11-29 06:46:10.600 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:46:10 compute-1 nova_compute[225815]: 2025-11-29 06:46:10.601 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:46:10 compute-1 nova_compute[225815]: 2025-11-29 06:46:10.647 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:46:11 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:46:11 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1964139985' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:11 compute-1 nova_compute[225815]: 2025-11-29 06:46:11.288 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.641s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:46:11 compute-1 nova_compute[225815]: 2025-11-29 06:46:11.295 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:46:11 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:11 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/616557685' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:11.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:12 compute-1 sshd-session[226999]: Received disconnect from 93.157.248.178 port 37422:11: Bye Bye [preauth]
Nov 29 06:46:12 compute-1 sshd-session[226999]: Disconnected from authenticating user root 93.157.248.178 port 37422 [preauth]
Nov 29 06:46:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:12.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:12 compute-1 nova_compute[225815]: 2025-11-29 06:46:12.441 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:46:12 compute-1 nova_compute[225815]: 2025-11-29 06:46:12.443 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:46:12 compute-1 nova_compute[225815]: 2025-11-29 06:46:12.444 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:46:12 compute-1 ceph-mon[80754]: pgmap v1012: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:12 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3071034240' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:12 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1964139985' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:46:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:13.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:13 compute-1 ceph-mon[80754]: pgmap v1013: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:14.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:15.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:15 compute-1 ceph-mon[80754]: pgmap v1014: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:46:15.913 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:46:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:46:15.914 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:46:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:46:15.914 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:46:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:16.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:17.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:17 compute-1 ceph-mon[80754]: pgmap v1015: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:18.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:19.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:19 compute-1 ceph-mon[80754]: pgmap v1016: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:20.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:21.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:21 compute-1 ceph-mon[80754]: pgmap v1017: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:22.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:23.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:23 compute-1 sudo[227002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:23 compute-1 sudo[227002]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:23 compute-1 sudo[227002]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:23 compute-1 sudo[227027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:46:23 compute-1 sudo[227027]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:23 compute-1 sudo[227027]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:24 compute-1 sudo[227052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:24 compute-1 sudo[227052]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:24 compute-1 sudo[227052]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:24 compute-1 sudo[227077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:46:24 compute-1 sudo[227077]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:24 compute-1 ceph-mon[80754]: pgmap v1018: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:24.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:24 compute-1 sudo[227077]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:46:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:46:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:46:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:46:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:46:25 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:46:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:25.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:26 compute-1 ceph-mon[80754]: pgmap v1019: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:26.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:27 compute-1 podman[227133]: 2025-11-29 06:46:27.370282977 +0000 UTC m=+0.104059475 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 06:46:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:27.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:28 compute-1 ceph-mon[80754]: pgmap v1020: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:28.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:29 compute-1 ceph-mon[80754]: pgmap v1021: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:29.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:30.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:31.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:32 compute-1 ceph-mon[80754]: pgmap v1022: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:32.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:33 compute-1 sudo[227161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:46:33 compute-1 sudo[227161]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:33 compute-1 sudo[227161]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:33 compute-1 sudo[227198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:46:33 compute-1 podman[227185]: 2025-11-29 06:46:33.078809299 +0000 UTC m=+0.057922131 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 29 06:46:33 compute-1 sudo[227198]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:46:33 compute-1 sudo[227198]: pam_unix(sudo:session): session closed for user root
Nov 29 06:46:33 compute-1 podman[227186]: 2025-11-29 06:46:33.086771951 +0000 UTC m=+0.061043733 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:46:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:46:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:33.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:46:33 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:46:33 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:46:33 compute-1 ceph-mon[80754]: pgmap v1023: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:34.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:35 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:46:35.706 139246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:05:03', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'aa:d2:09:dd:a5:e1'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:46:35 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:46:35.708 139246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:46:35 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:46:35.709 139246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2fa83236-07b6-4ff7-bb56-9f4f13bed719, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:46:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:35.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:36 compute-1 ceph-mon[80754]: pgmap v1024: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:36.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:37 compute-1 ceph-mon[80754]: pgmap v1025: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:37.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:38.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:39.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:40.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:40 compute-1 ceph-mon[80754]: pgmap v1026: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:41 compute-1 ceph-mon[80754]: pgmap v1027: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:41.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:42.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:43 compute-1 ceph-mon[80754]: pgmap v1028: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:43.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:44.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:45.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:46 compute-1 ceph-mon[80754]: pgmap v1029: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:46.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:47.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:47 compute-1 ceph-mon[80754]: pgmap v1030: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:48.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:49.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:50 compute-1 ceph-mon[80754]: pgmap v1031: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:50.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:51 compute-1 ceph-mon[80754]: pgmap v1032: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:51.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:52.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:53.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:54 compute-1 ceph-mon[80754]: pgmap v1033: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:54.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:55 compute-1 ceph-mon[80754]: pgmap v1034: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:55.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:46:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:56.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:46:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:57.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:46:58 compute-1 podman[227249]: 2025-11-29 06:46:58.363396504 +0000 UTC m=+0.089220657 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 06:46:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:46:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:58.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:46:58 compute-1 ceph-mon[80754]: pgmap v1035: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:46:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:46:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:46:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:59.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:46:59 compute-1 ceph-mon[80754]: pgmap v1036: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:00.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:01 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:01.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:02 compute-1 ceph-mon[80754]: pgmap v1037: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:02.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:03 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/3066267713' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:47:03 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/3066267713' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:47:03 compute-1 ceph-mon[80754]: pgmap v1038: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:03 compute-1 podman[227275]: 2025-11-29 06:47:03.328336487 +0000 UTC m=+0.059671897 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 06:47:03 compute-1 podman[227276]: 2025-11-29 06:47:03.354688172 +0000 UTC m=+0.079903408 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 06:47:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:03.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:04.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:05.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:06 compute-1 ceph-mon[80754]: pgmap v1039: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:06.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:07.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:07 compute-1 ceph-mon[80754]: pgmap v1040: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:08.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:09 compute-1 ceph-mon[80754]: pgmap v1041: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:09.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:10.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:11 compute-1 ceph-mon[80754]: pgmap v1042: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:11.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:11 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:12 compute-1 nova_compute[225815]: 2025-11-29 06:47:12.446 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:12 compute-1 nova_compute[225815]: 2025-11-29 06:47:12.446 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:12 compute-1 nova_compute[225815]: 2025-11-29 06:47:12.447 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:47:12 compute-1 nova_compute[225815]: 2025-11-29 06:47:12.447 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:47:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:12.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:13 compute-1 nova_compute[225815]: 2025-11-29 06:47:13.712 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:47:13 compute-1 nova_compute[225815]: 2025-11-29 06:47:13.712 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-1 nova_compute[225815]: 2025-11-29 06:47:13.714 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-1 nova_compute[225815]: 2025-11-29 06:47:13.714 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-1 nova_compute[225815]: 2025-11-29 06:47:13.714 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-1 nova_compute[225815]: 2025-11-29 06:47:13.715 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-1 nova_compute[225815]: 2025-11-29 06:47:13.715 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-1 nova_compute[225815]: 2025-11-29 06:47:13.715 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:47:13 compute-1 nova_compute[225815]: 2025-11-29 06:47:13.715 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:13.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:14 compute-1 ceph-mon[80754]: pgmap v1043: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:14 compute-1 nova_compute[225815]: 2025-11-29 06:47:14.056 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:14 compute-1 nova_compute[225815]: 2025-11-29 06:47:14.057 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:14 compute-1 nova_compute[225815]: 2025-11-29 06:47:14.057 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:14 compute-1 nova_compute[225815]: 2025-11-29 06:47:14.058 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:47:14 compute-1 nova_compute[225815]: 2025-11-29 06:47:14.058 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:47:14 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2186349468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:14.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:14 compute-1 nova_compute[225815]: 2025-11-29 06:47:14.527 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:14 compute-1 nova_compute[225815]: 2025-11-29 06:47:14.686 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:47:14 compute-1 nova_compute[225815]: 2025-11-29 06:47:14.687 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5369MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:47:14 compute-1 nova_compute[225815]: 2025-11-29 06:47:14.688 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:14 compute-1 nova_compute[225815]: 2025-11-29 06:47:14.688 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:15 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1028868458' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:15 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/2186349468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:15 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/3335706329' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:15.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:15 compute-1 nova_compute[225815]: 2025-11-29 06:47:15.866 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:47:15 compute-1 nova_compute[225815]: 2025-11-29 06:47:15.867 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:47:15 compute-1 nova_compute[225815]: 2025-11-29 06:47:15.887 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:47:15.915 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:47:15.916 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:47:15.916 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:16 compute-1 ceph-mon[80754]: pgmap v1044: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:47:16 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1939421518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:16 compute-1 nova_compute[225815]: 2025-11-29 06:47:16.405 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:16 compute-1 nova_compute[225815]: 2025-11-29 06:47:16.414 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:47:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:16.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:16 compute-1 nova_compute[225815]: 2025-11-29 06:47:16.546 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:47:16 compute-1 nova_compute[225815]: 2025-11-29 06:47:16.548 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:47:16 compute-1 nova_compute[225815]: 2025-11-29 06:47:16.548 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:17 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1939421518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:17 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/209762820' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:17 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1847130721' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:17.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.064 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.065 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.234 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.235 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.235 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.251 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.251 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.251 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.252 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.252 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.252 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.252 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.252 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.379 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.380 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.380 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.380 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.381 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:18 compute-1 ceph-mon[80754]: pgmap v1045: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:18.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:18 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:47:18 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1602380716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:18 compute-1 nova_compute[225815]: 2025-11-29 06:47:18.826 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:19 compute-1 nova_compute[225815]: 2025-11-29 06:47:19.013 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:47:19 compute-1 nova_compute[225815]: 2025-11-29 06:47:19.014 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5372MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:47:19 compute-1 nova_compute[225815]: 2025-11-29 06:47:19.014 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:19 compute-1 nova_compute[225815]: 2025-11-29 06:47:19.014 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:19 compute-1 nova_compute[225815]: 2025-11-29 06:47:19.497 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:47:19 compute-1 nova_compute[225815]: 2025-11-29 06:47:19.498 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:47:19 compute-1 nova_compute[225815]: 2025-11-29 06:47:19.518 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:19 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/687770592' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:19 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1602380716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:19 compute-1 ceph-mon[80754]: pgmap v1046: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:47:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:19.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:47:19 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:47:19 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2414150286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:19 compute-1 nova_compute[225815]: 2025-11-29 06:47:19.994 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:20 compute-1 nova_compute[225815]: 2025-11-29 06:47:20.001 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:47:20 compute-1 nova_compute[225815]: 2025-11-29 06:47:20.473 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:47:20 compute-1 nova_compute[225815]: 2025-11-29 06:47:20.474 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:47:20 compute-1 nova_compute[225815]: 2025-11-29 06:47:20.474 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.460s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:20.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:20 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/728103463' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:20 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/2414150286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:21 compute-1 nova_compute[225815]: 2025-11-29 06:47:21.189 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:21 compute-1 ceph-mon[80754]: pgmap v1047: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:21.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:22.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:22 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2547278201' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:23 compute-1 sshd-session[227405]: Invalid user botuser from 93.157.248.178 port 46782
Nov 29 06:47:23 compute-1 sshd-session[227405]: Received disconnect from 93.157.248.178 port 46782:11: Bye Bye [preauth]
Nov 29 06:47:23 compute-1 sshd-session[227405]: Disconnected from invalid user botuser 93.157.248.178 port 46782 [preauth]
Nov 29 06:47:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:23.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:24 compute-1 ceph-mon[80754]: pgmap v1048: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:24.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:25 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3841899284' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:47:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:25.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:26 compute-1 ceph-mon[80754]: pgmap v1049: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:26.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:27 compute-1 ceph-mon[80754]: pgmap v1050: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:27.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:28.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:29 compute-1 podman[227407]: 2025-11-29 06:47:29.334369411 +0000 UTC m=+0.079348763 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:47:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:29.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:30 compute-1 ceph-mon[80754]: pgmap v1051: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:30.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:31.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:32 compute-1 ceph-mon[80754]: pgmap v1052: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:32.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:33 compute-1 sudo[227433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:33 compute-1 sudo[227433]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:33 compute-1 sudo[227433]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:33 compute-1 sudo[227458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:47:33 compute-1 sudo[227458]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:33 compute-1 sudo[227458]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:33 compute-1 sudo[227495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:33 compute-1 sudo[227495]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:33 compute-1 sudo[227495]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:33 compute-1 podman[227482]: 2025-11-29 06:47:33.491165855 +0000 UTC m=+0.077761282 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 06:47:33 compute-1 podman[227483]: 2025-11-29 06:47:33.499454167 +0000 UTC m=+0.086790523 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 29 06:47:33 compute-1 sudo[227546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:47:33 compute-1 sudo[227546]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:33.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:33 compute-1 sudo[227546]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:34 compute-1 ceph-mon[80754]: pgmap v1053: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:34.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:47:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:47:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:47:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:47:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:47:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:47:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:47:35 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:47:35 compute-1 ceph-mon[80754]: pgmap v1054: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:35.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:36.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:37.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:38 compute-1 ceph-mon[80754]: pgmap v1055: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:38.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:39.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:40 compute-1 ceph-mon[80754]: pgmap v1056: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:40.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:41 compute-1 ceph-mon[80754]: pgmap v1057: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:41 compute-1 sshd-session[227604]: Invalid user elasticsearch from 71.70.164.48 port 39296
Nov 29 06:47:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:41 compute-1 sshd-session[227604]: Received disconnect from 71.70.164.48 port 39296:11: Bye Bye [preauth]
Nov 29 06:47:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:41.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:41 compute-1 sshd-session[227604]: Disconnected from invalid user elasticsearch 71.70.164.48 port 39296 [preauth]
Nov 29 06:47:42 compute-1 sudo[227606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:47:42 compute-1 sudo[227606]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:42 compute-1 sudo[227606]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:42.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:42 compute-1 sudo[227631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:47:42 compute-1 sudo[227631]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:47:42 compute-1 sudo[227631]: pam_unix(sudo:session): session closed for user root
Nov 29 06:47:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:47:43 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:47:43 compute-1 ceph-mon[80754]: pgmap v1058: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:43.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:44.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:45.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:46 compute-1 ceph-mon[80754]: pgmap v1059: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:46.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:47 compute-1 ceph-mon[80754]: pgmap v1060: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:47.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:48.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:49.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:50 compute-1 ceph-mon[80754]: pgmap v1061: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:50.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:51 compute-1 ceph-mon[80754]: pgmap v1062: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:51.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:52.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:53.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:54 compute-1 ceph-mon[80754]: pgmap v1063: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:54.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:55 compute-1 ceph-mon[80754]: pgmap v1064: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:55.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:56.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.077007) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877077087, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1876, "num_deletes": 251, "total_data_size": 4555047, "memory_usage": 4605016, "flush_reason": "Manual Compaction"}
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877091720, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1737351, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18479, "largest_seqno": 20349, "table_properties": {"data_size": 1731823, "index_size": 2668, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14351, "raw_average_key_size": 20, "raw_value_size": 1719513, "raw_average_value_size": 2428, "num_data_blocks": 123, "num_entries": 708, "num_filter_entries": 708, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398703, "oldest_key_time": 1764398703, "file_creation_time": 1764398877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 14781 microseconds, and 6320 cpu microseconds.
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.091787) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1737351 bytes OK
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.091810) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.093739) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.093759) EVENT_LOG_v1 {"time_micros": 1764398877093754, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.093778) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 4546656, prev total WAL file size 4546656, number of live WAL files 2.
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.095146) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353033' seq:72057594037927935, type:22 .. '6D67727374617400373535' seq:0, type:0; will stop at (end)
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1696KB)], [36(9239KB)]
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877095208, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 11198228, "oldest_snapshot_seqno": -1}
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4812 keys, 8586310 bytes, temperature: kUnknown
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877163574, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 8586310, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8554107, "index_size": 19101, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12037, "raw_key_size": 121248, "raw_average_key_size": 25, "raw_value_size": 8466904, "raw_average_value_size": 1759, "num_data_blocks": 783, "num_entries": 4812, "num_filter_entries": 4812, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.163893) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 8586310 bytes
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.165342) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 163.5 rd, 125.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 9.0 +0.0 blob) out(8.2 +0.0 blob), read-write-amplify(11.4) write-amplify(4.9) OK, records in: 5246, records dropped: 434 output_compression: NoCompression
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.165359) EVENT_LOG_v1 {"time_micros": 1764398877165350, "job": 20, "event": "compaction_finished", "compaction_time_micros": 68480, "compaction_time_cpu_micros": 27997, "output_level": 6, "num_output_files": 1, "total_output_size": 8586310, "num_input_records": 5246, "num_output_records": 4812, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877165718, "job": 20, "event": "table_file_deletion", "file_number": 38}
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877167342, "job": 20, "event": "table_file_deletion", "file_number": 36}
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.095054) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.167443) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.167449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.167451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.167453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:57 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.167455) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:47:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:47:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:57.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:47:58 compute-1 ceph-mon[80754]: pgmap v1065: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:58.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:47:59 compute-1 ceph-mon[80754]: pgmap v1066: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:47:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:47:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:47:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:59.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:00 compute-1 podman[227656]: 2025-11-29 06:48:00.384264309 +0000 UTC m=+0.113395574 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:48:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:00.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:01 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:01.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:02 compute-1 ceph-mon[80754]: pgmap v1067: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:02.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:03 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/4080925924' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:48:03 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/4080925924' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:48:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:03.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:04 compute-1 podman[227682]: 2025-11-29 06:48:04.322991971 +0000 UTC m=+0.066906101 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 06:48:04 compute-1 podman[227683]: 2025-11-29 06:48:04.333691797 +0000 UTC m=+0.073913109 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:48:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:04.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:05 compute-1 ceph-mon[80754]: pgmap v1068: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:05.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:06 compute-1 nova_compute[225815]: 2025-11-29 06:48:06.076 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 16.51 sec
Nov 29 06:48:06 compute-1 ceph-mon[80754]: pgmap v1069: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:06.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:07 compute-1 ceph-mon[80754]: pgmap v1070: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:07.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:08.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:09.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:10 compute-1 ceph-mon[80754]: pgmap v1071: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:10.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:11 compute-1 ceph-mon[80754]: pgmap v1072: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:11 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:11.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:12.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:13.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:13 compute-1 nova_compute[225815]: 2025-11-29 06:48:13.967 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:13 compute-1 nova_compute[225815]: 2025-11-29 06:48:13.968 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 06:48:14 compute-1 ceph-mon[80754]: pgmap v1073: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:14.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:15.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:48:15.917 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:48:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:48:15.918 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:48:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:48:15.918 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:16 compute-1 ceph-mon[80754]: pgmap v1074: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:16.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:17 compute-1 ceph-mon[80754]: pgmap v1075: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:17.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:18 compute-1 nova_compute[225815]: 2025-11-29 06:48:18.021 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 1.94 sec
Nov 29 06:48:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:18.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:19 compute-1 ceph-mon[80754]: pgmap v1076: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:19.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:20.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:21.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:22 compute-1 ceph-mon[80754]: pgmap v1077: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:22.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:23.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:24 compute-1 ceph-mon[80754]: pgmap v1078: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:24.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:25 compute-1 ceph-mon[80754]: pgmap v1079: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:25.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:26.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:27.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:28 compute-1 ceph-mon[80754]: pgmap v1080: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:28.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:28 compute-1 nova_compute[225815]: 2025-11-29 06:48:28.745 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 06:48:28 compute-1 nova_compute[225815]: 2025-11-29 06:48:28.749 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:28 compute-1 nova_compute[225815]: 2025-11-29 06:48:28.749 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 06:48:29 compute-1 ceph-mon[80754]: pgmap v1081: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:29.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:30.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:31 compute-1 podman[227719]: 2025-11-29 06:48:31.369429556 +0000 UTC m=+0.103866609 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:48:31 compute-1 ceph-mon[80754]: pgmap v1082: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:31.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:32.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:33.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:34 compute-1 ceph-mon[80754]: pgmap v1083: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:34.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:35 compute-1 podman[227747]: 2025-11-29 06:48:35.34526085 +0000 UTC m=+0.078117861 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 06:48:35 compute-1 podman[227748]: 2025-11-29 06:48:35.345394064 +0000 UTC m=+0.078414039 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 06:48:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:35.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:36 compute-1 ceph-mon[80754]: pgmap v1084: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:48:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 6783 writes, 26K keys, 6783 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 6783 writes, 1386 syncs, 4.89 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 499 writes, 770 keys, 499 commit groups, 1.0 writes per commit group, ingest: 0.25 MB, 0.00 MB/s
                                           Interval WAL: 499 writes, 242 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 06:48:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:36.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:37.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:38 compute-1 ceph-mon[80754]: pgmap v1085: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:38 compute-1 sshd-session[227787]: Invalid user desliga from 93.157.248.178 port 36972
Nov 29 06:48:38 compute-1 sshd-session[227787]: Received disconnect from 93.157.248.178 port 36972:11: Bye Bye [preauth]
Nov 29 06:48:38 compute-1 sshd-session[227787]: Disconnected from invalid user desliga 93.157.248.178 port 36972 [preauth]
Nov 29 06:48:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:38.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:39.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:40 compute-1 ceph-mon[80754]: pgmap v1086: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:40.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:41 compute-1 nova_compute[225815]: 2025-11-29 06:48:41.775 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 13.75 sec
Nov 29 06:48:41 compute-1 ceph-mon[80754]: pgmap v1087: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:41.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:42.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:42 compute-1 sudo[227789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:42 compute-1 sudo[227789]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:42 compute-1 sudo[227789]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:42 compute-1 sudo[227814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:48:42 compute-1 sudo[227814]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:42 compute-1 sudo[227814]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:42 compute-1 sudo[227839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:42 compute-1 sudo[227839]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:42 compute-1 sudo[227839]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:43 compute-1 sudo[227864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:48:43 compute-1 sudo[227864]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:43 compute-1 ceph-mon[80754]: pgmap v1088: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:43.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:44 compute-1 podman[227961]: 2025-11-29 06:48:44.104339756 +0000 UTC m=+0.603869623 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default)
Nov 29 06:48:44 compute-1 podman[227961]: 2025-11-29 06:48:44.202832961 +0000 UTC m=+0.702362728 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:48:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:44.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:44 compute-1 sudo[227864]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:44 compute-1 sudo[228084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:44 compute-1 sudo[228084]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:44 compute-1 sudo[228084]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:44 compute-1 sudo[228109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:48:44 compute-1 sudo[228109]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:44 compute-1 sudo[228109]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:44 compute-1 sudo[228134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:44 compute-1 sudo[228134]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:44 compute-1 sudo[228134]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:45 compute-1 sudo[228159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:48:45 compute-1 sudo[228159]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:45 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:45 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:45 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:45 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.078632) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925078713, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 711, "num_deletes": 251, "total_data_size": 1330634, "memory_usage": 1351040, "flush_reason": "Manual Compaction"}
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925110707, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 878455, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20354, "largest_seqno": 21060, "table_properties": {"data_size": 874937, "index_size": 1426, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7933, "raw_average_key_size": 19, "raw_value_size": 867897, "raw_average_value_size": 2121, "num_data_blocks": 62, "num_entries": 409, "num_filter_entries": 409, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398878, "oldest_key_time": 1764398878, "file_creation_time": 1764398925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 32130 microseconds, and 4359 cpu microseconds.
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.110772) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 878455 bytes OK
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.110806) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.116419) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.116438) EVENT_LOG_v1 {"time_micros": 1764398925116432, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.116456) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 1326846, prev total WAL file size 1326846, number of live WAL files 2.
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.117092) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(857KB)], [39(8385KB)]
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925117145, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 9464765, "oldest_snapshot_seqno": -1}
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 4703 keys, 7370713 bytes, temperature: kUnknown
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925158917, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 7370713, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7340287, "index_size": 17580, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11781, "raw_key_size": 119554, "raw_average_key_size": 25, "raw_value_size": 7255961, "raw_average_value_size": 1542, "num_data_blocks": 714, "num_entries": 4703, "num_filter_entries": 4703, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.159430) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 7370713 bytes
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.161058) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 225.1 rd, 175.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 8.2 +0.0 blob) out(7.0 +0.0 blob), read-write-amplify(19.2) write-amplify(8.4) OK, records in: 5221, records dropped: 518 output_compression: NoCompression
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.161078) EVENT_LOG_v1 {"time_micros": 1764398925161068, "job": 22, "event": "compaction_finished", "compaction_time_micros": 42039, "compaction_time_cpu_micros": 17607, "output_level": 6, "num_output_files": 1, "total_output_size": 7370713, "num_input_records": 5221, "num_output_records": 4703, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925162017, "job": 22, "event": "table_file_deletion", "file_number": 41}
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925164076, "job": 22, "event": "table_file_deletion", "file_number": 39}
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.116966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.164380) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.164386) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.164388) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.164390) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.164391) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:48:45 compute-1 sudo[228159]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:45 compute-1 sudo[228214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:45 compute-1 sudo[228214]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:45 compute-1 sudo[228214]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:45 compute-1 sudo[228239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:48:45 compute-1 sudo[228239]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:45 compute-1 sudo[228239]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:45 compute-1 sudo[228264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:45 compute-1 sudo[228264]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:45 compute-1 sudo[228264]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:45 compute-1 sudo[228289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 list-networks
Nov 29 06:48:45 compute-1 sudo[228289]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:45 compute-1 sudo[228289]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:45.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:46 compute-1 ceph-mon[80754]: pgmap v1089: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:46 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:46 compute-1 sudo[228330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:46 compute-1 sudo[228330]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:46 compute-1 sudo[228330]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:46 compute-1 sudo[228355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:48:46 compute-1 sudo[228355]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:46 compute-1 sudo[228355]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:46 compute-1 sudo[228380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:46 compute-1 sudo[228380]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:46 compute-1 sudo[228380]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:46 compute-1 sudo[228405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ceph-volume --fsid 336ec58c-893b-528f-a0c1-6ed1196bc047 -- inventory --format=json-pretty --filter-for-batch
Nov 29 06:48:46 compute-1 sudo[228405]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:46 compute-1 podman[228469]: 2025-11-29 06:48:46.561321511 +0000 UTC m=+0.028844703 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:48:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:46.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:46 compute-1 podman[228469]: 2025-11-29 06:48:46.961674681 +0000 UTC m=+0.429197793 container create d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:48:47 compute-1 systemd[1]: Started libpod-conmon-d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61.scope.
Nov 29 06:48:47 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:48:47 compute-1 podman[228469]: 2025-11-29 06:48:47.312955177 +0000 UTC m=+0.780478369 container init d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:48:47 compute-1 podman[228469]: 2025-11-29 06:48:47.326334175 +0000 UTC m=+0.793857317 container start d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True)
Nov 29 06:48:47 compute-1 bold_cohen[228485]: 167 167
Nov 29 06:48:47 compute-1 systemd[1]: libpod-d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61.scope: Deactivated successfully.
Nov 29 06:48:47 compute-1 podman[228469]: 2025-11-29 06:48:47.343548195 +0000 UTC m=+0.811071377 container attach d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Nov 29 06:48:47 compute-1 podman[228469]: 2025-11-29 06:48:47.344182343 +0000 UTC m=+0.811705475 container died d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:48:47 compute-1 systemd[1]: var-lib-containers-storage-overlay-8113f39255a3fa0d837a24c602ef032e8f0e2d9c3f4a185599c6ed8ab4a68107-merged.mount: Deactivated successfully.
Nov 29 06:48:47 compute-1 ceph-mon[80754]: pgmap v1090: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:47 compute-1 podman[228469]: 2025-11-29 06:48:47.831567951 +0000 UTC m=+1.299091093 container remove d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Nov 29 06:48:47 compute-1 systemd[1]: libpod-conmon-d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61.scope: Deactivated successfully.
Nov 29 06:48:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:47.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:48 compute-1 podman[228509]: 2025-11-29 06:48:48.003004456 +0000 UTC m=+0.042176709 container create f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3)
Nov 29 06:48:48 compute-1 systemd[1]: Started libpod-conmon-f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096.scope.
Nov 29 06:48:48 compute-1 podman[228509]: 2025-11-29 06:48:47.984640615 +0000 UTC m=+0.023812698 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 06:48:48 compute-1 systemd[1]: Started libcrun container.
Nov 29 06:48:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18aa92f6b0e2733f1717726aed8428950bd2876963bc14bef06892dfbce88a20/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 06:48:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18aa92f6b0e2733f1717726aed8428950bd2876963bc14bef06892dfbce88a20/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 06:48:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18aa92f6b0e2733f1717726aed8428950bd2876963bc14bef06892dfbce88a20/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 06:48:48 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18aa92f6b0e2733f1717726aed8428950bd2876963bc14bef06892dfbce88a20/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 06:48:48 compute-1 podman[228509]: 2025-11-29 06:48:48.122977775 +0000 UTC m=+0.162149868 container init f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 06:48:48 compute-1 podman[228509]: 2025-11-29 06:48:48.128907054 +0000 UTC m=+0.168079117 container start f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 06:48:48 compute-1 podman[228509]: 2025-11-29 06:48:48.162956196 +0000 UTC m=+0.202128259 container attach f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Nov 29 06:48:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:48.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:49 compute-1 strange_cannon[228525]: [
Nov 29 06:48:49 compute-1 strange_cannon[228525]:     {
Nov 29 06:48:49 compute-1 strange_cannon[228525]:         "available": false,
Nov 29 06:48:49 compute-1 strange_cannon[228525]:         "ceph_device": false,
Nov 29 06:48:49 compute-1 strange_cannon[228525]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:         "lsm_data": {},
Nov 29 06:48:49 compute-1 strange_cannon[228525]:         "lvs": [],
Nov 29 06:48:49 compute-1 strange_cannon[228525]:         "path": "/dev/sr0",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:         "rejected_reasons": [
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "Has a FileSystem",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "Insufficient space (<5GB)"
Nov 29 06:48:49 compute-1 strange_cannon[228525]:         ],
Nov 29 06:48:49 compute-1 strange_cannon[228525]:         "sys_api": {
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "actuators": null,
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "device_nodes": "sr0",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "devname": "sr0",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "human_readable_size": "482.00 KB",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "id_bus": "ata",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "model": "QEMU DVD-ROM",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "nr_requests": "2",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "parent": "/dev/sr0",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "partitions": {},
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "path": "/dev/sr0",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "removable": "1",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "rev": "2.5+",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "ro": "0",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "rotational": "1",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "sas_address": "",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "sas_device_handle": "",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "scheduler_mode": "mq-deadline",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "sectors": 0,
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "sectorsize": "2048",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "size": 493568.0,
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "support_discard": "2048",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "type": "disk",
Nov 29 06:48:49 compute-1 strange_cannon[228525]:             "vendor": "QEMU"
Nov 29 06:48:49 compute-1 strange_cannon[228525]:         }
Nov 29 06:48:49 compute-1 strange_cannon[228525]:     }
Nov 29 06:48:49 compute-1 strange_cannon[228525]: ]
Nov 29 06:48:49 compute-1 systemd[1]: libpod-f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096.scope: Deactivated successfully.
Nov 29 06:48:49 compute-1 systemd[1]: libpod-f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096.scope: Consumed 1.272s CPU time.
Nov 29 06:48:49 compute-1 podman[228509]: 2025-11-29 06:48:49.375792839 +0000 UTC m=+1.414964932 container died f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 06:48:49 compute-1 systemd[1]: var-lib-containers-storage-overlay-18aa92f6b0e2733f1717726aed8428950bd2876963bc14bef06892dfbce88a20-merged.mount: Deactivated successfully.
Nov 29 06:48:49 compute-1 podman[228509]: 2025-11-29 06:48:49.537439492 +0000 UTC m=+1.576611555 container remove f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 06:48:49 compute-1 systemd[1]: libpod-conmon-f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096.scope: Deactivated successfully.
Nov 29 06:48:49 compute-1 sudo[228405]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:49.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:50 compute-1 ceph-mon[80754]: pgmap v1091: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:50 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:50 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:50 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:48:50 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:48:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:50.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:51 compute-1 nova_compute[225815]: 2025-11-29 06:48:51.093 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:48:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:48:51 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:48:51 compute-1 ceph-mon[80754]: pgmap v1092: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:51.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:52.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:53.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:48:54 compute-1 ceph-mon[80754]: pgmap v1093: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:54.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:55 compute-1 nova_compute[225815]: 2025-11-29 06:48:55.883 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 4.11 sec
Nov 29 06:48:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:55.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:56 compute-1 ceph-mon[80754]: pgmap v1094: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:56.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:48:57 compute-1 ceph-mon[80754]: pgmap v1095: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:57.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:48:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:58.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:48:58 compute-1 sudo[229740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:48:58 compute-1 sudo[229740]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:58 compute-1 sudo[229740]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:58 compute-1 sudo[229765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:48:58 compute-1 sudo[229765]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:48:58 compute-1 sudo[229765]: pam_unix(sudo:session): session closed for user root
Nov 29 06:48:59 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:59 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:48:59 compute-1 ceph-mon[80754]: pgmap v1096: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:48:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:48:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:48:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:59.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:00.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:01 compute-1 ceph-mon[80754]: pgmap v1097: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:01 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:01.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:02 compute-1 podman[229790]: 2025-11-29 06:49:02.424034041 +0000 UTC m=+0.156542589 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 06:49:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:49:02 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4223701543' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:49:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:49:02 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4223701543' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:49:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:02.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:02 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/4223701543' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:49:02 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/4223701543' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:49:02 compute-1 nova_compute[225815]: 2025-11-29 06:49:02.831 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:02 compute-1 nova_compute[225815]: 2025-11-29 06:49:02.831 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:02 compute-1 nova_compute[225815]: 2025-11-29 06:49:02.832 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:49:02 compute-1 nova_compute[225815]: 2025-11-29 06:49:02.832 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:49:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:03.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:03 compute-1 ceph-mon[80754]: pgmap v1098: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:04.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:05.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:06 compute-1 ceph-mon[80754]: pgmap v1099: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:06 compute-1 podman[229818]: 2025-11-29 06:49:06.322038134 +0000 UTC m=+0.066257423 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 06:49:06 compute-1 podman[229817]: 2025-11-29 06:49:06.341143945 +0000 UTC m=+0.079954239 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:49:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:06.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:07.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:08.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:09 compute-1 ceph-mon[80754]: pgmap v1100: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:09.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:10 compute-1 ceph-mon[80754]: pgmap v1101: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:10.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:11 compute-1 ceph-mon[80754]: pgmap v1102: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:11 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:12.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:12.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:14.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:14 compute-1 ceph-mon[80754]: pgmap v1103: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:14.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:15 compute-1 ceph-mon[80754]: pgmap v1104: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:49:15.918 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:49:15.919 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:49:15.919 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:16.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:16.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:18.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:18 compute-1 ceph-mon[80754]: pgmap v1105: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:18.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 06:49:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:20.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 06:49:20 compute-1 ceph-mon[80754]: pgmap v1106: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:20.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:21 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:49:21 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 3866 writes, 21K keys, 3866 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s
                                           Cumulative WAL: 3866 writes, 3866 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1409 writes, 7247 keys, 1409 commit groups, 1.0 writes per commit group, ingest: 14.93 MB, 0.02 MB/s
                                           Interval WAL: 1409 writes, 1409 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     50.1      0.52              0.09        11    0.048       0      0       0.0       0.0
                                             L6      1/0    7.03 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    108.3     89.4      1.02              0.28        10    0.102     49K   5239       0.0       0.0
                                            Sum      1/0    7.03 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     71.4     76.0      1.54              0.38        21    0.073     49K   5239       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.7    111.6    106.3      0.58              0.19        12    0.049     31K   3467       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    108.3     89.4      1.02              0.28        10    0.102     49K   5239       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     63.0      0.42              0.09        10    0.042       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.026, interval 0.009
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.11 GB write, 0.06 MB/s write, 0.11 GB read, 0.06 MB/s read, 1.5 seconds
                                           Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.6 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x562155f711f0#2 capacity: 304.00 MB usage: 8.06 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(419,7.66 MB,2.51979%) FilterBlock(21,140.86 KB,0.0452493%) IndexBlock(21,272.67 KB,0.0875925%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Nov 29 06:49:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:22.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:22 compute-1 ceph-mon[80754]: pgmap v1107: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:49:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:22.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:49:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:24.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:24 compute-1 nova_compute[225815]: 2025-11-29 06:49:24.150 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 18.27 sec
Nov 29 06:49:24 compute-1 ceph-mon[80754]: pgmap v1108: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:24.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:26.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:26.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:27 compute-1 ceph-mon[80754]: pgmap v1109: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:28.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:28.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:28 compute-1 ceph-mon[80754]: pgmap v1110: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:29 compute-1 ceph-mon[80754]: pgmap v1111: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:30.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:30.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:32.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:32 compute-1 ceph-mon[80754]: pgmap v1112: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:32.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:33 compute-1 podman[229856]: 2025-11-29 06:49:33.375084259 +0000 UTC m=+0.102277167 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 06:49:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:34.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:34 compute-1 nova_compute[225815]: 2025-11-29 06:49:34.164 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:49:34 compute-1 nova_compute[225815]: 2025-11-29 06:49:34.165 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-1 nova_compute[225815]: 2025-11-29 06:49:34.165 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-1 nova_compute[225815]: 2025-11-29 06:49:34.165 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-1 nova_compute[225815]: 2025-11-29 06:49:34.165 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-1 nova_compute[225815]: 2025-11-29 06:49:34.166 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-1 nova_compute[225815]: 2025-11-29 06:49:34.166 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-1 nova_compute[225815]: 2025-11-29 06:49:34.166 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:49:34 compute-1 nova_compute[225815]: 2025-11-29 06:49:34.166 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:34 compute-1 ceph-mon[80754]: pgmap v1113: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:34.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:35 compute-1 ceph-mon[80754]: pgmap v1114: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:36.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:36.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:37 compute-1 podman[229884]: 2025-11-29 06:49:37.313772809 +0000 UTC m=+0.051551370 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:49:37 compute-1 nova_compute[225815]: 2025-11-29 06:49:37.336 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.18 sec
Nov 29 06:49:37 compute-1 podman[229883]: 2025-11-29 06:49:37.341197282 +0000 UTC m=+0.083537495 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:49:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:38.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:38 compute-1 ceph-mon[80754]: pgmap v1115: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:38.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:40 compute-1 ceph-mon[80754]: pgmap v1116: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:40.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:40.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:42 compute-1 ceph-mon[80754]: pgmap v1117: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:42.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:42.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:43 compute-1 ceph-mon[80754]: pgmap v1118: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:44.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:44.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:46.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:46 compute-1 ceph-mon[80754]: pgmap v1119: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:46.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:47 compute-1 ceph-mon[80754]: pgmap v1120: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:48.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:48 compute-1 sshd-session[229922]: Received disconnect from 93.157.248.178 port 49462:11: Bye Bye [preauth]
Nov 29 06:49:48 compute-1 sshd-session[229922]: Disconnected from authenticating user root 93.157.248.178 port 49462 [preauth]
Nov 29 06:49:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:48.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:50 compute-1 ceph-mon[80754]: pgmap v1121: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:50.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:50.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:51 compute-1 nova_compute[225815]: 2025-11-29 06:49:51.117 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:51 compute-1 nova_compute[225815]: 2025-11-29 06:49:51.117 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:51 compute-1 nova_compute[225815]: 2025-11-29 06:49:51.117 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:51 compute-1 nova_compute[225815]: 2025-11-29 06:49:51.118 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:49:51 compute-1 nova_compute[225815]: 2025-11-29 06:49:51.118 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:49:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:49:51 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1398036893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:49:51 compute-1 nova_compute[225815]: 2025-11-29 06:49:51.573 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:49:51 compute-1 nova_compute[225815]: 2025-11-29 06:49:51.766 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:49:51 compute-1 nova_compute[225815]: 2025-11-29 06:49:51.767 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5359MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:49:51 compute-1 nova_compute[225815]: 2025-11-29 06:49:51.767 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:51 compute-1 nova_compute[225815]: 2025-11-29 06:49:51.768 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:52.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:52.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:53 compute-1 ceph-mon[80754]: pgmap v1122: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:53 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2261730173' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:49:53 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1398036893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:49:53 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1336302150' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:49:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:54.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:54 compute-1 ceph-mon[80754]: pgmap v1123: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:54.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:55 compute-1 ceph-mgr[81116]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 06:49:56 compute-1 ceph-mon[80754]: pgmap v1124: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:56.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:49:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:56.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:49:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:49:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:58.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:58 compute-1 ceph-mon[80754]: pgmap v1125: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:49:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:49:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:58.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:49:59 compute-1 sudo[229946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:59 compute-1 sudo[229946]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:59 compute-1 sudo[229946]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:59 compute-1 sudo[229971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:49:59 compute-1 sudo[229971]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:59 compute-1 sudo[229971]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:59 compute-1 sudo[229996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:59 compute-1 sudo[229996]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:59 compute-1 sudo[229996]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:59 compute-1 sudo[230021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 check-host
Nov 29 06:49:59 compute-1 sudo[230021]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:59 compute-1 ceph-mon[80754]: pgmap v1126: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:49:59 compute-1 sudo[230021]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:59 compute-1 sudo[230066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:59 compute-1 sudo[230066]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:59 compute-1 sudo[230066]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:59 compute-1 sudo[230091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:49:59 compute-1 sudo[230091]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:59 compute-1 sudo[230091]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:59 compute-1 sudo[230116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:49:59 compute-1 sudo[230116]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:49:59 compute-1 sudo[230116]: pam_unix(sudo:session): session closed for user root
Nov 29 06:49:59 compute-1 sudo[230141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:49:59 compute-1 sudo[230141]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:50:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:00.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:00 compute-1 sudo[230141]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:00 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:00 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:00 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:00 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:00 compute-1 ceph-mon[80754]: overall HEALTH_OK
Nov 29 06:50:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:00.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:01 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:02.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:02 compute-1 ceph-mon[80754]: pgmap v1127: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:02.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:04.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:04 compute-1 podman[230197]: 2025-11-29 06:50:04.341609558 +0000 UTC m=+0.086811703 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:50:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:04.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/2319096117' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:50:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/2319096117' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:50:04 compute-1 ceph-mon[80754]: pgmap v1128: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:06.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:06 compute-1 ceph-mon[80754]: pgmap v1129: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:06.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:06 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:50:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:50:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:50:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:50:07 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:50:07 compute-1 ceph-mon[80754]: pgmap v1130: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:08.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:08 compute-1 podman[230224]: 2025-11-29 06:50:08.307752732 +0000 UTC m=+0.048351574 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 06:50:08 compute-1 podman[230223]: 2025-11-29 06:50:08.345959314 +0000 UTC m=+0.089465794 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 06:50:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:08.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:09 compute-1 ceph-mon[80754]: pgmap v1131: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:10.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:10.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:11 compute-1 ceph-mon[80754]: pgmap v1132: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:11 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:12.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:12.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:14.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:14 compute-1 ceph-mon[80754]: pgmap v1133: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:14.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:50:15.919 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:50:15.919 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:50:15.919 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:15 compute-1 ceph-mon[80754]: pgmap v1134: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:16.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:16.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:17 compute-1 ceph-mon[80754]: pgmap v1135: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:18.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:18 compute-1 sudo[230262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:50:18 compute-1 sudo[230262]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:50:18 compute-1 sudo[230262]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:18 compute-1 sudo[230287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:50:18 compute-1 sudo[230287]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:50:18 compute-1 sudo[230287]: pam_unix(sudo:session): session closed for user root
Nov 29 06:50:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:18.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:19 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:50:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:20.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:20 compute-1 nova_compute[225815]: 2025-11-29 06:50:20.615 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 33.28 sec
Nov 29 06:50:20 compute-1 ceph-mon[80754]: pgmap v1136: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:20.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:21 compute-1 ceph-mon[80754]: pgmap v1137: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:21 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:22.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:22.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:24.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:24 compute-1 ceph-mon[80754]: pgmap v1138: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:24.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:26.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:26 compute-1 ceph-mon[80754]: pgmap v1139: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:26.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:27 compute-1 ceph-mon[80754]: pgmap v1140: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:28.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:28.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:30.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:30 compute-1 ceph-mon[80754]: pgmap v1141: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:30.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:31 compute-1 ceph-mon[80754]: pgmap v1142: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:32.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:32.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:33 compute-1 ceph-mon[80754]: pgmap v1143: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:34.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:34.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:35 compute-1 podman[230312]: 2025-11-29 06:50:35.370057203 +0000 UTC m=+0.102324638 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:50:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:36.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:36 compute-1 ceph-mon[80754]: pgmap v1144: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:36.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:37 compute-1 ceph-mon[80754]: pgmap v1145: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:38.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:38.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:39 compute-1 podman[230338]: 2025-11-29 06:50:39.31981361 +0000 UTC m=+0.064070315 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:50:39 compute-1 podman[230339]: 2025-11-29 06:50:39.331144083 +0000 UTC m=+0.067032854 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:50:39 compute-1 nova_compute[225815]: 2025-11-29 06:50:39.592 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:50:39 compute-1 nova_compute[225815]: 2025-11-29 06:50:39.594 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:50:39 compute-1 nova_compute[225815]: 2025-11-29 06:50:39.747 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing inventories for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 06:50:39 compute-1 nova_compute[225815]: 2025-11-29 06:50:39.773 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Updating ProviderTree inventory for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 06:50:39 compute-1 nova_compute[225815]: 2025-11-29 06:50:39.773 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Updating inventory in ProviderTree for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:50:39 compute-1 nova_compute[225815]: 2025-11-29 06:50:39.790 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing aggregate associations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 06:50:39 compute-1 nova_compute[225815]: 2025-11-29 06:50:39.822 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing trait associations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSSE3,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 06:50:39 compute-1 nova_compute[225815]: 2025-11-29 06:50:39.921 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:40.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:40 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:50:40 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4231777605' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:50:40 compute-1 ceph-mon[80754]: pgmap v1146: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:40 compute-1 nova_compute[225815]: 2025-11-29 06:50:40.363 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:40 compute-1 nova_compute[225815]: 2025-11-29 06:50:40.370 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:50:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:40.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:41 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3008365598' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:50:41 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/3662469046' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:50:41 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/4231777605' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:50:41 compute-1 ceph-mon[80754]: pgmap v1147: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:41 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:42.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:42.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:44 compute-1 ceph-mon[80754]: pgmap v1148: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:44.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:44.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:45 compute-1 ceph-mon[80754]: pgmap v1149: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:46.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:46.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:48.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:48 compute-1 ceph-mon[80754]: pgmap v1150: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:48.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:50.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:50 compute-1 ceph-mon[80754]: pgmap v1151: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:50.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:51 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:52.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:52 compute-1 ceph-mon[80754]: pgmap v1152: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:52.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:53 compute-1 ceph-mon[80754]: pgmap v1153: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:53 compute-1 nova_compute[225815]: 2025-11-29 06:50:53.950 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 13.33 sec
Nov 29 06:50:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:50:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:54.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:50:54 compute-1 sshd-session[230401]: Invalid user ts3 from 93.157.248.178 port 42172
Nov 29 06:50:54 compute-1 sshd-session[230401]: Received disconnect from 93.157.248.178 port 42172:11: Bye Bye [preauth]
Nov 29 06:50:54 compute-1 sshd-session[230401]: Disconnected from invalid user ts3 93.157.248.178 port 42172 [preauth]
Nov 29 06:50:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:54.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:55 compute-1 ceph-mon[80754]: pgmap v1154: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:56.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:50:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:56.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:56 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:50:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:50:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:58.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:50:58 compute-1 ceph-mon[80754]: pgmap v1155: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:50:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:50:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:50:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:58.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:00.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:00 compute-1 ceph-mon[80754]: pgmap v1156: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:00.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:02 compute-1 ceph-mon[80754]: pgmap v1157: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:02.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:02 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Nov 29 06:51:02 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Nov 29 06:51:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:02 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Nov 29 06:51:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:02.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:03 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/3119268237' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:51:03 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/3119268237' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:51:03 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Nov 29 06:51:03 compute-1 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Nov 29 06:51:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:04.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:04 compute-1 ceph-mon[80754]: pgmap v1158: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:04.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:06.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:06 compute-1 podman[230403]: 2025-11-29 06:51:06.362251755 +0000 UTC m=+0.097543096 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:51:06 compute-1 ceph-mon[80754]: pgmap v1159: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 4.2 KiB/s rd, 0 B/s wr, 7 op/s
Nov 29 06:51:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:06.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:07 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:07 compute-1 ceph-mon[80754]: pgmap v1160: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 54 KiB/s rd, 0 B/s wr, 90 op/s
Nov 29 06:51:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:08.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:08.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:10 compute-1 ceph-mon[80754]: pgmap v1161: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 74 KiB/s rd, 0 B/s wr, 123 op/s
Nov 29 06:51:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:10.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:10 compute-1 nova_compute[225815]: 2025-11-29 06:51:10.202 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:51:10 compute-1 nova_compute[225815]: 2025-11-29 06:51:10.204 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:51:10 compute-1 nova_compute[225815]: 2025-11-29 06:51:10.204 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 78.436s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:10 compute-1 podman[230431]: 2025-11-29 06:51:10.324566774 +0000 UTC m=+0.057566199 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 06:51:10 compute-1 podman[230430]: 2025-11-29 06:51:10.348773355 +0000 UTC m=+0.086452056 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Nov 29 06:51:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:10.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:12.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:12 compute-1 ceph-mon[80754]: pgmap v1162: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 74 KiB/s rd, 0 B/s wr, 123 op/s
Nov 29 06:51:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:12.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:14.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:14 compute-1 ceph-mon[80754]: pgmap v1163: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 82 KiB/s rd, 0 B/s wr, 136 op/s
Nov 29 06:51:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:14.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:15 compute-1 ceph-mon[80754]: pgmap v1164: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 82 KiB/s rd, 0 B/s wr, 136 op/s
Nov 29 06:51:15 compute-1 nova_compute[225815]: 2025-11-29 06:51:15.572 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 11.62 sec
Nov 29 06:51:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:51:15.922 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:51:15.922 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:51:15.922 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:16.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:16.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:18 compute-1 ceph-mon[80754]: pgmap v1165: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 78 KiB/s rd, 0 B/s wr, 129 op/s
Nov 29 06:51:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:18.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:18 compute-1 sudo[230469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:18 compute-1 sudo[230469]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:18 compute-1 sudo[230469]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:18 compute-1 sudo[230494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:51:18 compute-1 sudo[230494]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:18 compute-1 sudo[230494]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:18 compute-1 sudo[230519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:18 compute-1 sudo[230519]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:18 compute-1 sudo[230519]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:18.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:18 compute-1 sudo[230544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:51:18 compute-1 sudo[230544]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:19 compute-1 sudo[230544]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:20.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:20 compute-1 ceph-mon[80754]: pgmap v1166: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 28 KiB/s rd, 0 B/s wr, 46 op/s
Nov 29 06:51:20 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:20 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:20 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 06:51:20 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 06:51:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:51:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:20.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:51:21 compute-1 ceph-mon[80754]: pgmap v1167: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 7.8 KiB/s rd, 0 B/s wr, 13 op/s
Nov 29 06:51:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:22.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:22 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:22.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:51:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:51:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:51:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:51:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:51:23 compute-1 ceph-mon[80754]: pgmap v1168: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail; 7.8 KiB/s rd, 0 B/s wr, 13 op/s
Nov 29 06:51:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:24.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:24.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:25 compute-1 nova_compute[225815]: 2025-11-29 06:51:25.334 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:25 compute-1 nova_compute[225815]: 2025-11-29 06:51:25.335 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:25 compute-1 sshd-session[230600]: Invalid user admin from 2.57.121.112 port 61427
Nov 29 06:51:25 compute-1 ceph-mon[80754]: pgmap v1169: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:26 compute-1 sshd-session[230600]: Received disconnect from 2.57.121.112 port 61427:11: Bye [preauth]
Nov 29 06:51:26 compute-1 sshd-session[230600]: Disconnected from invalid user admin 2.57.121.112 port 61427 [preauth]
Nov 29 06:51:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.003000079s ======
Nov 29 06:51:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:26.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000079s
Nov 29 06:51:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:26.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:28.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:28 compute-1 ceph-mon[80754]: pgmap v1170: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:28.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:29 compute-1 nova_compute[225815]: 2025-11-29 06:51:29.399 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:29 compute-1 nova_compute[225815]: 2025-11-29 06:51:29.400 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:51:29 compute-1 nova_compute[225815]: 2025-11-29 06:51:29.400 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:51:29 compute-1 ceph-mon[80754]: pgmap v1171: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:30 compute-1 nova_compute[225815]: 2025-11-29 06:51:30.196 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 4.62 sec
Nov 29 06:51:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:30.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:30 compute-1 nova_compute[225815]: 2025-11-29 06:51:30.498 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:51:30 compute-1 nova_compute[225815]: 2025-11-29 06:51:30.499 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-1 nova_compute[225815]: 2025-11-29 06:51:30.499 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-1 nova_compute[225815]: 2025-11-29 06:51:30.499 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-1 nova_compute[225815]: 2025-11-29 06:51:30.499 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-1 nova_compute[225815]: 2025-11-29 06:51:30.499 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-1 nova_compute[225815]: 2025-11-29 06:51:30.500 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-1 nova_compute[225815]: 2025-11-29 06:51:30.500 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:51:30 compute-1 nova_compute[225815]: 2025-11-29 06:51:30.500 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:30.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:31 compute-1 nova_compute[225815]: 2025-11-29 06:51:31.082 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:31 compute-1 nova_compute[225815]: 2025-11-29 06:51:31.082 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:31 compute-1 nova_compute[225815]: 2025-11-29 06:51:31.083 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:31 compute-1 nova_compute[225815]: 2025-11-29 06:51:31.083 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:51:31 compute-1 nova_compute[225815]: 2025-11-29 06:51:31.083 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:51:31 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:51:31 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/990427680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:31 compute-1 nova_compute[225815]: 2025-11-29 06:51:31.514 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:51:31 compute-1 nova_compute[225815]: 2025-11-29 06:51:31.688 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:51:31 compute-1 nova_compute[225815]: 2025-11-29 06:51:31.690 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5364MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:51:31 compute-1 nova_compute[225815]: 2025-11-29 06:51:31.690 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:31 compute-1 nova_compute[225815]: 2025-11-29 06:51:31.690 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:32.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:32 compute-1 ceph-mon[80754]: pgmap v1172: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:32 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/990427680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:32 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2305942895' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:32 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/763194160' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:32 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:32 compute-1 nova_compute[225815]: 2025-11-29 06:51:32.922 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:51:32 compute-1 nova_compute[225815]: 2025-11-29 06:51:32.922 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:51:32 compute-1 nova_compute[225815]: 2025-11-29 06:51:32.935 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:51:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:32.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:33 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:51:33 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3831186218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:33 compute-1 nova_compute[225815]: 2025-11-29 06:51:33.412 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:51:33 compute-1 nova_compute[225815]: 2025-11-29 06:51:33.417 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:51:33 compute-1 ceph-mon[80754]: pgmap v1173: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:34.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:34.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:35 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3831186218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:35 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/1735230033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:35 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/4133065214' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:51:36 compute-1 sudo[230648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:51:36 compute-1 sudo[230648]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:36 compute-1 sudo[230648]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:36 compute-1 sudo[230673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:51:36 compute-1 sudo[230673]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:51:36 compute-1 sudo[230673]: pam_unix(sudo:session): session closed for user root
Nov 29 06:51:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:36.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:36 compute-1 ceph-mon[80754]: pgmap v1174: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:36 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:51:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:36.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:37 compute-1 podman[230698]: 2025-11-29 06:51:37.360045791 +0000 UTC m=+0.105349485 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 06:51:37 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:37 compute-1 ceph-mon[80754]: pgmap v1175: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:38 compute-1 sshd-session[230646]: Invalid user wordpress from 66.94.122.234 port 50796
Nov 29 06:51:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:38.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:38 compute-1 sshd-session[230646]: Received disconnect from 66.94.122.234 port 50796:11: Bye Bye [preauth]
Nov 29 06:51:38 compute-1 sshd-session[230646]: Disconnected from invalid user wordpress 66.94.122.234 port 50796 [preauth]
Nov 29 06:51:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:38.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:40.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:40 compute-1 nova_compute[225815]: 2025-11-29 06:51:40.518 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:51:40 compute-1 nova_compute[225815]: 2025-11-29 06:51:40.521 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:51:40 compute-1 nova_compute[225815]: 2025-11-29 06:51:40.521 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 8.831s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:40 compute-1 ceph-mon[80754]: pgmap v1176: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:40.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:41 compute-1 podman[230724]: 2025-11-29 06:51:41.311061597 +0000 UTC m=+0.055989328 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 29 06:51:41 compute-1 podman[230725]: 2025-11-29 06:51:41.311080857 +0000 UTC m=+0.050999852 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 06:51:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:42.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:42 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:42 compute-1 ceph-mon[80754]: pgmap v1177: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:42.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:44.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:44 compute-1 ceph-mon[80754]: pgmap v1178: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:44.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:45 compute-1 ceph-mon[80754]: pgmap v1179: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:46.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:46.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:47 compute-1 ceph-mon[80754]: pgmap v1180: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:48.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:48.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:49 compute-1 ceph-mon[80754]: pgmap v1181: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:50.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:50.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:52.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:52 compute-1 ceph-mon[80754]: pgmap v1182: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:52.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:53 compute-1 ceph-mon[80754]: pgmap v1183: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:54.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:54.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:56 compute-1 nova_compute[225815]: 2025-11-29 06:51:56.038 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 5.84 sec
Nov 29 06:51:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:56.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:56 compute-1 ceph-mon[80754]: pgmap v1184: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:56.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:51:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:51:57 compute-1 ceph-mon[80754]: pgmap v1185: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:51:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:51:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:58.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:51:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:51:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:51:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:58.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:00 compute-1 sshd-session[230762]: Invalid user copia from 93.157.248.178 port 43112
Nov 29 06:52:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:00.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:00 compute-1 ceph-mon[80754]: pgmap v1186: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:00 compute-1 sshd-session[230762]: Received disconnect from 93.157.248.178 port 43112:11: Bye Bye [preauth]
Nov 29 06:52:00 compute-1 sshd-session[230762]: Disconnected from invalid user copia 93.157.248.178 port 43112 [preauth]
Nov 29 06:52:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:01.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:02.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:02 compute-1 ceph-mon[80754]: pgmap v1187: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:03.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/3126230656' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:52:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/3126230656' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:52:04 compute-1 ceph-mon[80754]: pgmap v1188: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:04.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:05.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:06 compute-1 ceph-mon[80754]: pgmap v1189: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:06.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:07.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:07 compute-1 ceph-mon[80754]: pgmap v1190: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:07 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:08.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:08 compute-1 podman[230764]: 2025-11-29 06:52:08.339795988 +0000 UTC m=+0.082214332 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:52:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:09.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:09 compute-1 ceph-mon[80754]: pgmap v1191: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:10.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:11.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:12 compute-1 ceph-mon[80754]: pgmap v1192: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:12 compute-1 podman[230792]: 2025-11-29 06:52:12.303546905 +0000 UTC m=+0.048667121 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 06:52:12 compute-1 podman[230791]: 2025-11-29 06:52:12.303588206 +0000 UTC m=+0.050524040 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 29 06:52:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:12.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:13.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:14.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:14 compute-1 ceph-mon[80754]: pgmap v1193: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:15.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:15 compute-1 ceph-mon[80754]: pgmap v1194: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:52:15.923 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:52:15.924 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:52:15.924 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:16.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:17.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:17 compute-1 ceph-mon[80754]: pgmap v1195: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:18.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:19.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:20 compute-1 ceph-mon[80754]: pgmap v1196: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:20.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:21.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:21 compute-1 ceph-mon[80754]: pgmap v1197: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:22.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:22 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:23.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:23 compute-1 ceph-mon[80754]: pgmap v1198: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:24.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:25.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:25 compute-1 ceph-mon[80754]: pgmap v1199: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:26.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:27.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.502251) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147502348, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 2358, "num_deletes": 251, "total_data_size": 6203043, "memory_usage": 6281200, "flush_reason": "Manual Compaction"}
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Nov 29 06:52:27 compute-1 ceph-mon[80754]: pgmap v1200: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147525251, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 4024087, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21065, "largest_seqno": 23418, "table_properties": {"data_size": 4014455, "index_size": 6126, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19000, "raw_average_key_size": 20, "raw_value_size": 3995455, "raw_average_value_size": 4214, "num_data_blocks": 274, "num_entries": 948, "num_filter_entries": 948, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398925, "oldest_key_time": 1764398925, "file_creation_time": 1764399147, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 23148 microseconds, and 8736 cpu microseconds.
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.525393) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 4024087 bytes OK
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.525422) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.527798) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.527816) EVENT_LOG_v1 {"time_micros": 1764399147527811, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.527841) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 6192706, prev total WAL file size 6192706, number of live WAL files 2.
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.529692) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(3929KB)], [42(7197KB)]
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147530191, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 11394800, "oldest_snapshot_seqno": -1}
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5132 keys, 9349278 bytes, temperature: kUnknown
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147597182, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 9349278, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9314499, "index_size": 20845, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12869, "raw_key_size": 128876, "raw_average_key_size": 25, "raw_value_size": 9221031, "raw_average_value_size": 1796, "num_data_blocks": 857, "num_entries": 5132, "num_filter_entries": 5132, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764399147, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.597599) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 9349278 bytes
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.599327) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 169.8 rd, 139.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 7.0 +0.0 blob) out(8.9 +0.0 blob), read-write-amplify(5.2) write-amplify(2.3) OK, records in: 5651, records dropped: 519 output_compression: NoCompression
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.599345) EVENT_LOG_v1 {"time_micros": 1764399147599335, "job": 24, "event": "compaction_finished", "compaction_time_micros": 67122, "compaction_time_cpu_micros": 24779, "output_level": 6, "num_output_files": 1, "total_output_size": 9349278, "num_input_records": 5651, "num_output_records": 5132, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147600522, "job": 24, "event": "table_file_deletion", "file_number": 44}
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147602091, "job": 24, "event": "table_file_deletion", "file_number": 42}
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.529531) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.602126) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.602131) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.602133) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.602135) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:27 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.602137) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:28.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:29.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.167133) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149167215, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 262, "num_deletes": 256, "total_data_size": 20532, "memory_usage": 27576, "flush_reason": "Manual Compaction"}
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149170060, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 13142, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23420, "largest_seqno": 23680, "table_properties": {"data_size": 11326, "index_size": 49, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4242, "raw_average_key_size": 16, "raw_value_size": 7889, "raw_average_value_size": 30, "num_data_blocks": 2, "num_entries": 261, "num_filter_entries": 261, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764399149, "oldest_key_time": 1764399149, "file_creation_time": 1764399149, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 2954 microseconds, and 1035 cpu microseconds.
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.170103) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 13142 bytes OK
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.170124) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.171581) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.171605) EVENT_LOG_v1 {"time_micros": 1764399149171600, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.171626) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 18466, prev total WAL file size 18466, number of live WAL files 2.
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.172389) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323534' seq:72057594037927935, type:22 .. '6C6F676D00353036' seq:0, type:0; will stop at (end)
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(12KB)], [45(9130KB)]
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149172429, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 9362420, "oldest_snapshot_seqno": -1}
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 4877 keys, 9228718 bytes, temperature: kUnknown
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149242967, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 9228718, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9195140, "index_size": 20284, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 124773, "raw_average_key_size": 25, "raw_value_size": 9105598, "raw_average_value_size": 1867, "num_data_blocks": 828, "num_entries": 4877, "num_filter_entries": 4877, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764399149, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.243218) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 9228718 bytes
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.245018) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 132.6 rd, 130.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 8.9 +0.0 blob) out(8.8 +0.0 blob), read-write-amplify(1414.6) write-amplify(702.2) OK, records in: 5393, records dropped: 516 output_compression: NoCompression
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.245037) EVENT_LOG_v1 {"time_micros": 1764399149245028, "job": 26, "event": "compaction_finished", "compaction_time_micros": 70618, "compaction_time_cpu_micros": 22970, "output_level": 6, "num_output_files": 1, "total_output_size": 9228718, "num_input_records": 5393, "num_output_records": 4877, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149245151, "job": 26, "event": "table_file_deletion", "file_number": 47}
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149246621, "job": 26, "event": "table_file_deletion", "file_number": 45}
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.171920) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.246685) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.246689) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.246691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.246692) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:29 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.246693) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:52:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:30.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:30 compute-1 ceph-mon[80754]: pgmap v1201: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:31.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:31 compute-1 ceph-mon[80754]: pgmap v1202: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:32.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:32 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:33.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:34.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:34 compute-1 ceph-mon[80754]: pgmap v1203: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:35.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:35 compute-1 ceph-mon[80754]: pgmap v1204: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:36.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:36 compute-1 sudo[230828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:36 compute-1 sudo[230828]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:36 compute-1 sudo[230828]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:36 compute-1 sudo[230853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:52:36 compute-1 sudo[230853]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:36 compute-1 sudo[230853]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:36 compute-1 sudo[230878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:36 compute-1 sudo[230878]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:36 compute-1 sudo[230878]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:36 compute-1 sudo[230903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:52:36 compute-1 sudo[230903]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:37.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:37 compute-1 sudo[230903]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:37 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:38.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:38 compute-1 ceph-mon[80754]: pgmap v1205: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:39.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:39 compute-1 podman[230959]: 2025-11-29 06:52:39.346172084 +0000 UTC m=+0.089148469 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:52:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:52:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:52:39 compute-1 ceph-mon[80754]: pgmap v1206: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 06:52:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:52:39 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:52:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:40.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:40 compute-1 nova_compute[225815]: 2025-11-29 06:52:40.523 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:40 compute-1 nova_compute[225815]: 2025-11-29 06:52:40.523 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:40 compute-1 nova_compute[225815]: 2025-11-29 06:52:40.524 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:52:40 compute-1 nova_compute[225815]: 2025-11-29 06:52:40.524 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:52:40 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:52:40 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:52:40 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:52:40 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:52:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:41.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:41 compute-1 ceph-mon[80754]: pgmap v1207: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:42.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:42 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:43.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:43 compute-1 podman[230986]: 2025-11-29 06:52:43.323716774 +0000 UTC m=+0.060746085 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 06:52:43 compute-1 podman[230985]: 2025-11-29 06:52:43.350262798 +0000 UTC m=+0.090207697 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 06:52:44 compute-1 ceph-mon[80754]: pgmap v1208: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:44.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:45.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:46 compute-1 ceph-mon[80754]: pgmap v1209: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:46.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:46 compute-1 sudo[231023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:52:46 compute-1 sudo[231023]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:46 compute-1 sudo[231023]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:46 compute-1 sudo[231048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:52:46 compute-1 sudo[231048]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:52:46 compute-1 sudo[231048]: pam_unix(sudo:session): session closed for user root
Nov 29 06:52:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:47.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:52:47 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:52:47 compute-1 ceph-mon[80754]: pgmap v1210: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:48.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:49.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:50 compute-1 ceph-mon[80754]: pgmap v1211: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:50.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:52:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:51.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:52:52 compute-1 ceph-mon[80754]: pgmap v1212: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:52.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:53.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:53 compute-1 ceph-mon[80754]: pgmap v1213: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:54.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:55.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:56 compute-1 ceph-mon[80754]: pgmap v1214: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:56.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:52:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:57.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:52:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:52:58 compute-1 ceph-mon[80754]: pgmap v1215: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:52:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:58.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:52:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:52:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:59.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:52:59 compute-1 ceph-mon[80754]: pgmap v1216: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:00.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:01.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:02 compute-1 ceph-mon[80754]: pgmap v1217: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:02.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:03.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:03 compute-1 ceph-mon[80754]: pgmap v1218: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:04.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:05.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:06 compute-1 sshd-session[231073]: Invalid user jrodrig from 93.157.248.178 port 44650
Nov 29 06:53:06 compute-1 ceph-mon[80754]: pgmap v1219: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:06 compute-1 sshd-session[231073]: Received disconnect from 93.157.248.178 port 44650:11: Bye Bye [preauth]
Nov 29 06:53:06 compute-1 sshd-session[231073]: Disconnected from invalid user jrodrig 93.157.248.178 port 44650 [preauth]
Nov 29 06:53:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:06.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:07.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:07 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:08 compute-1 ceph-mon[80754]: pgmap v1220: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:08.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:09.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:10 compute-1 podman[231075]: 2025-11-29 06:53:10.371160205 +0000 UTC m=+0.098103240 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:53:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:10.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:10 compute-1 ceph-mon[80754]: pgmap v1221: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:11.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:11 compute-1 ceph-mon[80754]: pgmap v1222: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:12.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:13.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:14 compute-1 podman[231102]: 2025-11-29 06:53:14.325376576 +0000 UTC m=+0.062002039 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 29 06:53:14 compute-1 podman[231101]: 2025-11-29 06:53:14.336390503 +0000 UTC m=+0.078995627 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 29 06:53:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:14.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:14 compute-1 ceph-mon[80754]: pgmap v1223: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:53:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:15.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:53:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:53:15.925 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:53:15.926 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:53:15.926 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:16 compute-1 ceph-mon[80754]: pgmap v1224: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:16.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:17.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:17 compute-1 nova_compute[225815]: 2025-11-29 06:53:17.514 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 51.48 sec
Nov 29 06:53:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:18.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:18 compute-1 nova_compute[225815]: 2025-11-29 06:53:18.449 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:53:18 compute-1 nova_compute[225815]: 2025-11-29 06:53:18.450 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-1 nova_compute[225815]: 2025-11-29 06:53:18.450 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-1 nova_compute[225815]: 2025-11-29 06:53:18.450 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-1 nova_compute[225815]: 2025-11-29 06:53:18.451 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-1 nova_compute[225815]: 2025-11-29 06:53:18.451 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-1 nova_compute[225815]: 2025-11-29 06:53:18.451 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:18 compute-1 nova_compute[225815]: 2025-11-29 06:53:18.451 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:53:18 compute-1 nova_compute[225815]: 2025-11-29 06:53:18.452 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:19.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:19 compute-1 ceph-mon[80754]: pgmap v1225: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:20.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:21 compute-1 ceph-mon[80754]: pgmap v1226: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:21.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:22 compute-1 sshd-session[231139]: Invalid user share from 66.94.122.234 port 58528
Nov 29 06:53:22 compute-1 ceph-mon[80754]: pgmap v1227: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:22 compute-1 sshd-session[231139]: Received disconnect from 66.94.122.234 port 58528:11: Bye Bye [preauth]
Nov 29 06:53:22 compute-1 sshd-session[231139]: Disconnected from invalid user share 66.94.122.234 port 58528 [preauth]
Nov 29 06:53:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:22.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:22 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:23.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:23 compute-1 nova_compute[225815]: 2025-11-29 06:53:23.216 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:23 compute-1 nova_compute[225815]: 2025-11-29 06:53:23.217 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:23 compute-1 nova_compute[225815]: 2025-11-29 06:53:23.217 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:23 compute-1 nova_compute[225815]: 2025-11-29 06:53:23.218 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:53:23 compute-1 nova_compute[225815]: 2025-11-29 06:53:23.218 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:53:23 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:53:23 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/586851354' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:23 compute-1 nova_compute[225815]: 2025-11-29 06:53:23.654 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:53:23 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:53:23 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2060863307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:23 compute-1 nova_compute[225815]: 2025-11-29 06:53:23.850 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:53:23 compute-1 nova_compute[225815]: 2025-11-29 06:53:23.851 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5353MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:53:23 compute-1 nova_compute[225815]: 2025-11-29 06:53:23.852 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:23 compute-1 nova_compute[225815]: 2025-11-29 06:53:23.852 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:24 compute-1 ceph-mon[80754]: pgmap v1228: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:24.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:25.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:26 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2185145600' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:26 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/586851354' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:26 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2060863307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:26 compute-1 ceph-mon[80754]: pgmap v1229: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:26.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:27.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:27 compute-1 ceph-mon[80754]: pgmap v1230: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:28.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:29.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:29 compute-1 ceph-mon[80754]: pgmap v1231: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:30.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:53:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:31.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:53:32 compute-1 ceph-mon[80754]: pgmap v1232: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:32.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:32 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:33.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:33 compute-1 ceph-mon[80754]: pgmap v1233: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:53:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:34.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:53:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:35.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:36 compute-1 ceph-mon[80754]: pgmap v1234: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:36.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:37.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:37 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:38.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:38 compute-1 ceph-mon[80754]: pgmap v1235: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:38 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1865317670' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:53:38 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1865317670' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:53:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:39.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:40 compute-1 ceph-mon[80754]: pgmap v1236: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:40 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1260406766' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:53:40 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1260406766' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:53:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:40.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:41.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:41 compute-1 podman[231164]: 2025-11-29 06:53:41.329427559 +0000 UTC m=+0.115122569 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:53:41 compute-1 ceph-mon[80754]: pgmap v1237: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:42.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:42 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:43.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:44 compute-1 ceph-mon[80754]: pgmap v1238: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:44.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:45.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:45 compute-1 podman[231191]: 2025-11-29 06:53:45.323061329 +0000 UTC m=+0.065899854 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 06:53:45 compute-1 podman[231192]: 2025-11-29 06:53:45.347106496 +0000 UTC m=+0.083448767 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 06:53:45 compute-1 ceph-mon[80754]: pgmap v1239: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:46.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:46 compute-1 sudo[231229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:46 compute-1 sudo[231229]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:46 compute-1 sudo[231229]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:47 compute-1 sudo[231254]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:53:47 compute-1 sudo[231254]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:47 compute-1 sudo[231254]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:47 compute-1 sudo[231279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:47 compute-1 sudo[231279]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:47 compute-1 sudo[231279]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:47 compute-1 sudo[231304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:53:47 compute-1 sudo[231304]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:47.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:47 compute-1 ceph-mon[80754]: pgmap v1240: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:47 compute-1 sudo[231304]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:48.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:53:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:53:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:53:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:53:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:53:48 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:53:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:49.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:50 compute-1 ceph-mon[80754]: pgmap v1241: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:50.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:51.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:52 compute-1 nova_compute[225815]: 2025-11-29 06:53:52.104 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:53:52 compute-1 nova_compute[225815]: 2025-11-29 06:53:52.106 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:53:52 compute-1 nova_compute[225815]: 2025-11-29 06:53:52.129 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:53:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:52.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:53:52 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3803659993' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:52 compute-1 nova_compute[225815]: 2025-11-29 06:53:52.610 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:53:52 compute-1 nova_compute[225815]: 2025-11-29 06:53:52.621 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:53:52 compute-1 ceph-mon[80754]: pgmap v1242: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:53:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:53.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:53:54 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/3731896802' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:54 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3803659993' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:54 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/185560197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:53:54 compute-1 ceph-mon[80754]: pgmap v1243: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:54 compute-1 nova_compute[225815]: 2025-11-29 06:53:54.477 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:53:54 compute-1 nova_compute[225815]: 2025-11-29 06:53:54.480 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:53:54 compute-1 nova_compute[225815]: 2025-11-29 06:53:54.480 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 30.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:54 compute-1 nova_compute[225815]: 2025-11-29 06:53:54.481 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:54 compute-1 nova_compute[225815]: 2025-11-29 06:53:54.481 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 06:53:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:54.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:55.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:56.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:57 compute-1 ceph-mon[80754]: pgmap v1244: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:57.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:53:58 compute-1 ceph-mon[80754]: pgmap v1245: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:53:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:58.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:58 compute-1 sudo[231382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:53:58 compute-1 sudo[231382]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:58 compute-1 sudo[231382]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:59 compute-1 sudo[231407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:53:59 compute-1 sudo[231407]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:53:59 compute-1 sudo[231407]: pam_unix(sudo:session): session closed for user root
Nov 29 06:53:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:53:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:53:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:59.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:53:59 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:53:59 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:53:59 compute-1 ceph-mon[80754]: pgmap v1246: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000053s ======
Nov 29 06:54:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:00.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Nov 29 06:54:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:01.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:02 compute-1 ceph-mon[80754]: pgmap v1247: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:02.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:02 compute-1 nova_compute[225815]: 2025-11-29 06:54:02.644 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 15.12 sec
Nov 29 06:54:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:54:02 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/972693801' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:54:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:54:02 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/972693801' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:54:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:03 compute-1 nova_compute[225815]: 2025-11-29 06:54:03.050 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 06:54:03 compute-1 nova_compute[225815]: 2025-11-29 06:54:03.050 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:03 compute-1 nova_compute[225815]: 2025-11-29 06:54:03.051 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 06:54:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:03.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:03 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/972693801' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:54:03 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/972693801' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:54:04 compute-1 ceph-mon[80754]: pgmap v1248: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:04 compute-1 nova_compute[225815]: 2025-11-29 06:54:04.481 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:04.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:05.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:05 compute-1 ceph-mon[80754]: pgmap v1249: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:06.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:07.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:07 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:08 compute-1 ceph-mon[80754]: pgmap v1250: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:08.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:09.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:10 compute-1 ceph-mon[80754]: pgmap v1251: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:10.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:11.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:12 compute-1 ceph-mon[80754]: pgmap v1252: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:12 compute-1 podman[231432]: 2025-11-29 06:54:12.397366202 +0000 UTC m=+0.129576247 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 06:54:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:12.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:13.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:13 compute-1 ceph-mon[80754]: pgmap v1253: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:14.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:15 compute-1 sshd-session[231458]: Invalid user ventas01 from 93.157.248.178 port 36188
Nov 29 06:54:15 compute-1 sshd-session[231458]: Received disconnect from 93.157.248.178 port 36188:11: Bye Bye [preauth]
Nov 29 06:54:15 compute-1 sshd-session[231458]: Disconnected from invalid user ventas01 93.157.248.178 port 36188 [preauth]
Nov 29 06:54:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:15.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:54:15.926 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:54:15.927 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:54:15.927 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:16 compute-1 ceph-mon[80754]: pgmap v1254: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:16 compute-1 podman[231461]: 2025-11-29 06:54:16.325504711 +0000 UTC m=+0.063280653 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:54:16 compute-1 podman[231460]: 2025-11-29 06:54:16.356371711 +0000 UTC m=+0.097409541 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 29 06:54:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:16.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:17.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:17 compute-1 ceph-mon[80754]: pgmap v1255: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:18.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:19.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:19 compute-1 ceph-mon[80754]: pgmap v1256: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:20.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:21.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:22.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:22 compute-1 ceph-mon[80754]: pgmap v1257: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:22 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:23.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:23 compute-1 ceph-mon[80754]: pgmap v1258: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:24.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:25.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:25 compute-1 ceph-mon[80754]: pgmap v1259: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:54:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:26.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:54:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:27.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:28 compute-1 ceph-mon[80754]: pgmap v1260: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:28.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:29.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:29 compute-1 ceph-mon[80754]: pgmap v1261: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:29 compute-1 nova_compute[225815]: 2025-11-29 06:54:29.734 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:29 compute-1 nova_compute[225815]: 2025-11-29 06:54:29.735 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:30.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:31.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:31 compute-1 nova_compute[225815]: 2025-11-29 06:54:31.628 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 8.98 sec
Nov 29 06:54:32 compute-1 ceph-mon[80754]: pgmap v1262: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:32.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:32 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:33.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:33 compute-1 ceph-mon[80754]: pgmap v1263: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:34.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:35.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:35 compute-1 ceph-mon[80754]: pgmap v1264: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:36.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:36 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/3604148447' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:54:36 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/3604148447' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:54:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:37.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:37 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:38.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:39 compute-1 ceph-mon[80754]: pgmap v1265: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:39.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:39 compute-1 nova_compute[225815]: 2025-11-29 06:54:39.898 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:39 compute-1 nova_compute[225815]: 2025-11-29 06:54:39.898 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:54:39 compute-1 nova_compute[225815]: 2025-11-29 06:54:39.899 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:54:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:40.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:40 compute-1 ceph-mon[80754]: pgmap v1266: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:41.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:41 compute-1 ceph-mon[80754]: pgmap v1267: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:42.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:42 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:43.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:43 compute-1 podman[231498]: 2025-11-29 06:54:43.341413092 +0000 UTC m=+0.083156028 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 06:54:43 compute-1 ceph-mon[80754]: pgmap v1268: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:44.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:45.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:45 compute-1 ceph-mon[80754]: pgmap v1269: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:46.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:47.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:47 compute-1 podman[231525]: 2025-11-29 06:54:47.310993227 +0000 UTC m=+0.053662794 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 29 06:54:47 compute-1 podman[231526]: 2025-11-29 06:54:47.311004488 +0000 UTC m=+0.049112073 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:54:47 compute-1 nova_compute[225815]: 2025-11-29 06:54:47.793 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.16 sec
Nov 29 06:54:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:47 compute-1 ceph-mon[80754]: pgmap v1270: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:48.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:49.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:49 compute-1 nova_compute[225815]: 2025-11-29 06:54:49.360 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:54:49 compute-1 nova_compute[225815]: 2025-11-29 06:54:49.361 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:49 compute-1 nova_compute[225815]: 2025-11-29 06:54:49.361 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:49 compute-1 nova_compute[225815]: 2025-11-29 06:54:49.361 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:49 compute-1 nova_compute[225815]: 2025-11-29 06:54:49.361 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:49 compute-1 nova_compute[225815]: 2025-11-29 06:54:49.361 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:49 compute-1 nova_compute[225815]: 2025-11-29 06:54:49.362 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:49 compute-1 ceph-mon[80754]: pgmap v1271: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:50.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:51.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:52.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:53.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:53 compute-1 ceph-mon[80754]: pgmap v1272: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:54.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:54 compute-1 ceph-mon[80754]: pgmap v1273: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:54:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:55.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:54:56 compute-1 ceph-mon[80754]: pgmap v1274: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:56.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:57.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:54:57 compute-1 ceph-mon[80754]: pgmap v1275: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:54:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:58.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:59 compute-1 sudo[231564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:54:59 compute-1 sudo[231564]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:59 compute-1 sudo[231564]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:54:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:54:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:59.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:54:59 compute-1 sudo[231589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:54:59 compute-1 sudo[231589]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:59 compute-1 sudo[231589]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:59 compute-1 sudo[231614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:54:59 compute-1 sudo[231614]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:59 compute-1 sudo[231614]: pam_unix(sudo:session): session closed for user root
Nov 29 06:54:59 compute-1 sudo[231639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:54:59 compute-1 sudo[231639]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:54:59 compute-1 sudo[231639]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:00 compute-1 ceph-mon[80754]: pgmap v1276: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:00.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:01.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:01 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:55:01 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:55:01 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:55:01 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:55:01 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:55:01 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:55:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:02.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:03 compute-1 ceph-mon[80754]: pgmap v1277: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:03.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2982005399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:04 compute-1 ceph-mon[80754]: pgmap v1278: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:04.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:05.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:05 compute-1 ceph-mon[80754]: pgmap v1279: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:06.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:07.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:07 compute-1 sshd-session[231696]: Invalid user halo from 66.94.122.234 port 32910
Nov 29 06:55:07 compute-1 sshd-session[231696]: Received disconnect from 66.94.122.234 port 32910:11: Bye Bye [preauth]
Nov 29 06:55:07 compute-1 sshd-session[231696]: Disconnected from invalid user halo 66.94.122.234 port 32910 [preauth]
Nov 29 06:55:07 compute-1 ceph-mon[80754]: pgmap v1280: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:07 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:08.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:09.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:10 compute-1 ceph-mon[80754]: pgmap v1281: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:10.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:11.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:12.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:13 compute-1 ceph-mon[80754]: pgmap v1282: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:13 compute-1 nova_compute[225815]: 2025-11-29 06:55:13.291 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:13 compute-1 nova_compute[225815]: 2025-11-29 06:55:13.291 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:55:13 compute-1 nova_compute[225815]: 2025-11-29 06:55:13.291 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:13.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:13 compute-1 nova_compute[225815]: 2025-11-29 06:55:13.474 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 5.67 sec
Nov 29 06:55:14 compute-1 nova_compute[225815]: 2025-11-29 06:55:14.085 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:14 compute-1 nova_compute[225815]: 2025-11-29 06:55:14.086 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:14 compute-1 nova_compute[225815]: 2025-11-29 06:55:14.086 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:14 compute-1 nova_compute[225815]: 2025-11-29 06:55:14.086 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:55:14 compute-1 nova_compute[225815]: 2025-11-29 06:55:14.087 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:14 compute-1 podman[231718]: 2025-11-29 06:55:14.39037315 +0000 UTC m=+0.124905681 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:55:14 compute-1 ceph-mon[80754]: pgmap v1283: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:14 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:55:14 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2860548019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:14 compute-1 nova_compute[225815]: 2025-11-29 06:55:14.545 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:14.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:14 compute-1 nova_compute[225815]: 2025-11-29 06:55:14.719 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:55:14 compute-1 nova_compute[225815]: 2025-11-29 06:55:14.720 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5322MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:55:14 compute-1 nova_compute[225815]: 2025-11-29 06:55:14.720 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:14 compute-1 nova_compute[225815]: 2025-11-29 06:55:14.720 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:15.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:15 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/2860548019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:15 compute-1 ceph-mon[80754]: pgmap v1284: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:55:15.927 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:55:15.927 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:55:15.927 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:16 compute-1 nova_compute[225815]: 2025-11-29 06:55:16.104 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:55:16 compute-1 nova_compute[225815]: 2025-11-29 06:55:16.104 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:55:16 compute-1 nova_compute[225815]: 2025-11-29 06:55:16.160 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:16 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:55:16 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/595463311' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:16 compute-1 nova_compute[225815]: 2025-11-29 06:55:16.598 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:16 compute-1 nova_compute[225815]: 2025-11-29 06:55:16.606 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:55:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:16.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:16 compute-1 nova_compute[225815]: 2025-11-29 06:55:16.924 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:55:16 compute-1 nova_compute[225815]: 2025-11-29 06:55:16.926 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:55:16 compute-1 nova_compute[225815]: 2025-11-29 06:55:16.927 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:17 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/3008667807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:17 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2736452741' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:17 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/595463311' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:17.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:18 compute-1 podman[231769]: 2025-11-29 06:55:18.318750921 +0000 UTC m=+0.054730405 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 06:55:18 compute-1 podman[231768]: 2025-11-29 06:55:18.319169052 +0000 UTC m=+0.061320161 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:55:18 compute-1 ceph-mon[80754]: pgmap v1285: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:18 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/570431191' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:55:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:18.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:19.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:19 compute-1 ceph-mon[80754]: pgmap v1286: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:20.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:20 compute-1 sudo[231805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:55:20 compute-1 sudo[231805]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:55:20 compute-1 sudo[231805]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:21 compute-1 sudo[231830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:55:21 compute-1 sudo[231830]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:55:21 compute-1 sudo[231830]: pam_unix(sudo:session): session closed for user root
Nov 29 06:55:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:21.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:21 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:55:21 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:55:21 compute-1 ceph-mon[80754]: pgmap v1287: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:22.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:22 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:23.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:23 compute-1 ceph-mon[80754]: pgmap v1288: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:23 compute-1 sshd-session[231855]: Received disconnect from 93.157.248.178 port 40240:11: Bye Bye [preauth]
Nov 29 06:55:23 compute-1 sshd-session[231855]: Disconnected from authenticating user root 93.157.248.178 port 40240 [preauth]
Nov 29 06:55:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:24.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:25.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:26 compute-1 ceph-mon[80754]: pgmap v1289: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:26.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:27.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:28 compute-1 ceph-mon[80754]: pgmap v1290: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:28.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:29.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:29 compute-1 ceph-mon[80754]: pgmap v1291: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:30.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:31.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:32 compute-1 ceph-mon[80754]: pgmap v1292: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:32.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:32 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:33.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:33 compute-1 ceph-mon[80754]: pgmap v1293: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:34.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:35.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:35 compute-1 ceph-mon[80754]: pgmap v1294: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:36.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:37 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/212983180' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:55:37 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/212983180' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:55:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:37.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:37 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:38 compute-1 ceph-mon[80754]: pgmap v1295: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:38.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:39.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:39 compute-1 ceph-mon[80754]: pgmap v1296: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:55:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:40.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:55:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:41.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:42 compute-1 ceph-mon[80754]: pgmap v1297: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:42.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:42 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:43.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:43 compute-1 ceph-mon[80754]: pgmap v1298: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:44.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:45.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:45 compute-1 podman[231857]: 2025-11-29 06:55:45.383335912 +0000 UTC m=+0.125266131 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:55:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:46.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:46 compute-1 ceph-mon[80754]: pgmap v1299: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:47.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:47 compute-1 ceph-mon[80754]: pgmap v1300: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:48.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:49 compute-1 podman[231884]: 2025-11-29 06:55:49.330205969 +0000 UTC m=+0.064233079 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 29 06:55:49 compute-1 podman[231883]: 2025-11-29 06:55:49.33505049 +0000 UTC m=+0.070807586 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:55:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:49.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:50 compute-1 ceph-mon[80754]: pgmap v1301: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:50.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:51.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:51 compute-1 ceph-mon[80754]: pgmap v1302: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:52.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:53.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:54 compute-1 ceph-mon[80754]: pgmap v1303: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:54.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:55.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:56 compute-1 ceph-mon[80754]: pgmap v1304: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:55:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:56.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:55:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:57.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:57 compute-1 ceph-mon[80754]: pgmap v1305: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:55:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:55:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:58.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:55:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:55:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:55:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:59.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:00 compute-1 ceph-mon[80754]: pgmap v1306: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:00.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:01.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:02.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:02 compute-1 ceph-mon[80754]: pgmap v1307: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:03.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:04 compute-1 ceph-mon[80754]: pgmap v1308: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:04.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:05.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:05 compute-1 nova_compute[225815]: 2025-11-29 06:56:05.493 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 2.01 sec
Nov 29 06:56:05 compute-1 ceph-mon[80754]: pgmap v1309: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:06.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:07.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:07 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:08 compute-1 ceph-mon[80754]: pgmap v1310: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:08.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:09.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:10.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:10 compute-1 ceph-mon[80754]: pgmap v1311: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:11.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:12 compute-1 ceph-mon[80754]: pgmap v1312: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:12.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:13.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:14 compute-1 ceph-mon[80754]: pgmap v1313: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:14.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:15.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:15 compute-1 ceph-mon[80754]: pgmap v1314: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:56:15.929 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:56:15.929 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:56:15.930 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:16 compute-1 podman[231923]: 2025-11-29 06:56:16.366216942 +0000 UTC m=+0.099666172 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 29 06:56:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:16.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:16 compute-1 nova_compute[225815]: 2025-11-29 06:56:16.929 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:16 compute-1 nova_compute[225815]: 2025-11-29 06:56:16.929 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:17.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:17 compute-1 nova_compute[225815]: 2025-11-29 06:56:17.896 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:17 compute-1 nova_compute[225815]: 2025-11-29 06:56:17.897 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:56:17 compute-1 nova_compute[225815]: 2025-11-29 06:56:17.897 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.295 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.296 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.296 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.297 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.297 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.297 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.298 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.298 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.298 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:18 compute-1 ceph-mon[80754]: pgmap v1315: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.499 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.499 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.499 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.500 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.500 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:18.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:18 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:56:18 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3654944032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:18 compute-1 nova_compute[225815]: 2025-11-29 06:56:18.964 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:19 compute-1 nova_compute[225815]: 2025-11-29 06:56:19.126 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:56:19 compute-1 nova_compute[225815]: 2025-11-29 06:56:19.127 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5333MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:56:19 compute-1 nova_compute[225815]: 2025-11-29 06:56:19.128 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:19 compute-1 nova_compute[225815]: 2025-11-29 06:56:19.128 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.381741) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379381806, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2336, "num_deletes": 251, "total_data_size": 5965364, "memory_usage": 6051152, "flush_reason": "Manual Compaction"}
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Nov 29 06:56:19 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/217955619' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:19 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3654944032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:19 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1431858114' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379406971, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3916609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23685, "largest_seqno": 26016, "table_properties": {"data_size": 3907165, "index_size": 6002, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18818, "raw_average_key_size": 20, "raw_value_size": 3888452, "raw_average_value_size": 4154, "num_data_blocks": 268, "num_entries": 936, "num_filter_entries": 936, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764399149, "oldest_key_time": 1764399149, "file_creation_time": 1764399379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 25302 microseconds, and 7843 cpu microseconds.
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.407035) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3916609 bytes OK
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.407061) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.409415) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.409443) EVENT_LOG_v1 {"time_micros": 1764399379409434, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.409464) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 5955169, prev total WAL file size 5955169, number of live WAL files 2.
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.411701) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3824KB)], [48(9012KB)]
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379411880, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 13145327, "oldest_snapshot_seqno": -1}
Nov 29 06:56:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:19.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5296 keys, 11154254 bytes, temperature: kUnknown
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379520489, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 11154254, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11116401, "index_size": 23535, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13253, "raw_key_size": 134021, "raw_average_key_size": 25, "raw_value_size": 11017939, "raw_average_value_size": 2080, "num_data_blocks": 968, "num_entries": 5296, "num_filter_entries": 5296, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764399379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.520820) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 11154254 bytes
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.522406) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 120.9 rd, 102.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 8.8 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(6.2) write-amplify(2.8) OK, records in: 5813, records dropped: 517 output_compression: NoCompression
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.522424) EVENT_LOG_v1 {"time_micros": 1764399379522414, "job": 28, "event": "compaction_finished", "compaction_time_micros": 108717, "compaction_time_cpu_micros": 31232, "output_level": 6, "num_output_files": 1, "total_output_size": 11154254, "num_input_records": 5813, "num_output_records": 5296, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379523714, "job": 28, "event": "table_file_deletion", "file_number": 50}
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379527566, "job": 28, "event": "table_file_deletion", "file_number": 48}
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.411496) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.527763) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.527771) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.527775) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.527778) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-1 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.527780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 06:56:19 compute-1 nova_compute[225815]: 2025-11-29 06:56:19.733 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:56:19 compute-1 nova_compute[225815]: 2025-11-29 06:56:19.734 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:56:19 compute-1 nova_compute[225815]: 2025-11-29 06:56:19.747 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing inventories for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 06:56:19 compute-1 nova_compute[225815]: 2025-11-29 06:56:19.977 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Updating ProviderTree inventory for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 06:56:19 compute-1 nova_compute[225815]: 2025-11-29 06:56:19.977 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Updating inventory in ProviderTree for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:56:20 compute-1 nova_compute[225815]: 2025-11-29 06:56:20.091 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing aggregate associations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 06:56:20 compute-1 nova_compute[225815]: 2025-11-29 06:56:20.110 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing trait associations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSSE3,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 06:56:20 compute-1 nova_compute[225815]: 2025-11-29 06:56:20.128 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:20 compute-1 podman[231972]: 2025-11-29 06:56:20.330517528 +0000 UTC m=+0.069919892 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:56:20 compute-1 podman[231974]: 2025-11-29 06:56:20.336330334 +0000 UTC m=+0.081998376 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 06:56:20 compute-1 ceph-mon[80754]: pgmap v1316: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:20 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:56:20 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3108214857' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:20 compute-1 nova_compute[225815]: 2025-11-29 06:56:20.591 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:20 compute-1 nova_compute[225815]: 2025-11-29 06:56:20.598 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:56:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:20.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:20 compute-1 nova_compute[225815]: 2025-11-29 06:56:20.872 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:56:20 compute-1 nova_compute[225815]: 2025-11-29 06:56:20.875 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:56:20 compute-1 nova_compute[225815]: 2025-11-29 06:56:20.876 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:21 compute-1 sudo[232029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:21 compute-1 sudo[232029]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:21 compute-1 sudo[232029]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:21 compute-1 sudo[232054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:56:21 compute-1 sudo[232054]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:21 compute-1 sudo[232054]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:21.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:21 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2239183731' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:21 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3108214857' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:21 compute-1 ceph-mon[80754]: pgmap v1317: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:21 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/886911526' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:56:21 compute-1 sudo[232079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:21 compute-1 sudo[232079]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:21 compute-1 sudo[232079]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:21 compute-1 sudo[232104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:56:21 compute-1 sudo[232104]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:21 compute-1 sudo[232104]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:22.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:22 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:56:22 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:56:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:23.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:56:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:56:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:56:23 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:56:23 compute-1 ceph-mon[80754]: pgmap v1318: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:24.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:25.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:26 compute-1 ceph-mon[80754]: pgmap v1319: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:26.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:27.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:27 compute-1 ceph-mon[80754]: pgmap v1320: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:28.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:29.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:29 compute-1 ceph-mon[80754]: pgmap v1321: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:30 compute-1 sshd-session[232159]: Invalid user elemental from 93.157.248.178 port 45070
Nov 29 06:56:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:30.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:30 compute-1 sshd-session[232159]: Received disconnect from 93.157.248.178 port 45070:11: Bye Bye [preauth]
Nov 29 06:56:30 compute-1 sshd-session[232159]: Disconnected from invalid user elemental 93.157.248.178 port 45070 [preauth]
Nov 29 06:56:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:31.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:31 compute-1 sudo[232161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:56:31 compute-1 sudo[232161]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:31 compute-1 sudo[232161]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:31 compute-1 sudo[232186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:56:31 compute-1 sudo[232186]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:56:31 compute-1 sudo[232186]: pam_unix(sudo:session): session closed for user root
Nov 29 06:56:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:32.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:32 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:33 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:56:33 compute-1 ceph-mon[80754]: pgmap v1322: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:33 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:56:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:33.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:34.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:34 compute-1 ceph-mon[80754]: pgmap v1323: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:35.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:36 compute-1 ceph-mon[80754]: pgmap v1324: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:36.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 06:56:36 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1564014978' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:56:36 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 06:56:36 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1564014978' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:56:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:37.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:37 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:38.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:39.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:40 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1564014978' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:56:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:40.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:41.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:41 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1564014978' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:56:41 compute-1 ceph-mon[80754]: pgmap v1325: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:41 compute-1 ceph-mon[80754]: pgmap v1326: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:42.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:42 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:43 compute-1 ceph-mon[80754]: pgmap v1327: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:43.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:44 compute-1 ceph-mon[80754]: pgmap v1328: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:44.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:45.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:46 compute-1 ceph-mon[80754]: pgmap v1329: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:56:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:46.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:56:47 compute-1 podman[232211]: 2025-11-29 06:56:47.360283506 +0000 UTC m=+0.101129332 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:56:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:47.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:47 compute-1 ceph-mon[80754]: pgmap v1330: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:48.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:49.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:50.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:51 compute-1 podman[232239]: 2025-11-29 06:56:51.325395814 +0000 UTC m=+0.062652297 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 06:56:51 compute-1 podman[232238]: 2025-11-29 06:56:51.337339546 +0000 UTC m=+0.069544003 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 06:56:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:51.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:52.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:53.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:54.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:55.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:56 compute-1 ceph-mon[80754]: pgmap v1331: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:56.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:56:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:57.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:56:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:56:58 compute-1 ceph-mon[80754]: pgmap v1332: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:58 compute-1 ceph-mon[80754]: pgmap v1333: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:58 compute-1 ceph-mon[80754]: pgmap v1334: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:56:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:58.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:56:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:56:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:59.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:56:59 compute-1 ceph-mon[80754]: pgmap v1335: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:00.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:01 compute-1 ceph-mon[80754]: pgmap v1336: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:01.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:02 compute-1 ceph-mon[80754]: pgmap v1337: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:02.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:03.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:03 compute-1 ceph-mon[80754]: pgmap v1338: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:04.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:05.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:05 compute-1 ceph-mon[80754]: pgmap v1339: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:06.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:07 compute-1 ceph-mon[80754]: pgmap v1340: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:07.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:07 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:08.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:09.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:10 compute-1 ceph-mon[80754]: pgmap v1341: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:10.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:11.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:12 compute-1 ceph-mon[80754]: pgmap v1342: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:12.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:13.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:13 compute-1 ceph-mon[80754]: pgmap v1343: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:14.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:15.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:57:15.929 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:57:15.929 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:57:15.930 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:16 compute-1 ceph-mon[80754]: pgmap v1344: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:16.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:17.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:18 compute-1 ceph-mon[80754]: pgmap v1345: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:18 compute-1 podman[232277]: 2025-11-29 06:57:18.382892573 +0000 UTC m=+0.129847724 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Nov 29 06:57:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:18.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:19.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:19 compute-1 ceph-mon[80754]: pgmap v1346: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:20.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:20 compute-1 nova_compute[225815]: 2025-11-29 06:57:20.878 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:20 compute-1 nova_compute[225815]: 2025-11-29 06:57:20.878 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:20 compute-1 nova_compute[225815]: 2025-11-29 06:57:20.878 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:57:20 compute-1 nova_compute[225815]: 2025-11-29 06:57:20.879 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:57:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:21.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:21 compute-1 ceph-mon[80754]: pgmap v1347: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:22 compute-1 podman[232304]: 2025-11-29 06:57:22.309536597 +0000 UTC m=+0.049087152 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:57:22 compute-1 podman[232305]: 2025-11-29 06:57:22.316419062 +0000 UTC m=+0.050107819 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:57:22 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:22.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:23.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:24 compute-1 ceph-mon[80754]: pgmap v1348: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:24.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:25.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:25 compute-1 ceph-mon[80754]: pgmap v1349: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:26 compute-1 nova_compute[225815]: 2025-11-29 06:57:26.571 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:57:26 compute-1 nova_compute[225815]: 2025-11-29 06:57:26.571 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-1 nova_compute[225815]: 2025-11-29 06:57:26.572 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-1 nova_compute[225815]: 2025-11-29 06:57:26.572 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-1 nova_compute[225815]: 2025-11-29 06:57:26.572 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-1 nova_compute[225815]: 2025-11-29 06:57:26.572 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-1 nova_compute[225815]: 2025-11-29 06:57:26.573 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-1 nova_compute[225815]: 2025-11-29 06:57:26.573 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:57:26 compute-1 nova_compute[225815]: 2025-11-29 06:57:26.573 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:26.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:27 compute-1 nova_compute[225815]: 2025-11-29 06:57:27.017 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:27 compute-1 nova_compute[225815]: 2025-11-29 06:57:27.017 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:27 compute-1 nova_compute[225815]: 2025-11-29 06:57:27.017 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:27 compute-1 nova_compute[225815]: 2025-11-29 06:57:27.017 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:57:27 compute-1 nova_compute[225815]: 2025-11-29 06:57:27.018 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:57:27 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2616126904' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:27 compute-1 nova_compute[225815]: 2025-11-29 06:57:27.483 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:27.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:27 compute-1 nova_compute[225815]: 2025-11-29 06:57:27.649 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:57:27 compute-1 nova_compute[225815]: 2025-11-29 06:57:27.650 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5352MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:57:27 compute-1 nova_compute[225815]: 2025-11-29 06:57:27.651 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:27 compute-1 nova_compute[225815]: 2025-11-29 06:57:27.651 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:27 compute-1 ceph-mon[80754]: pgmap v1350: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:28 compute-1 nova_compute[225815]: 2025-11-29 06:57:28.037 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:57:28 compute-1 nova_compute[225815]: 2025-11-29 06:57:28.037 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:57:28 compute-1 nova_compute[225815]: 2025-11-29 06:57:28.052 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:28 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:57:28 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3901645275' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:28 compute-1 nova_compute[225815]: 2025-11-29 06:57:28.495 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:28 compute-1 nova_compute[225815]: 2025-11-29 06:57:28.502 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:57:28 compute-1 nova_compute[225815]: 2025-11-29 06:57:28.551 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:57:28 compute-1 nova_compute[225815]: 2025-11-29 06:57:28.553 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:57:28 compute-1 nova_compute[225815]: 2025-11-29 06:57:28.554 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.903s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:28.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:29 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/2616126904' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1769306037' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2389936592' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3901645275' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/526359832' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/1141100869' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:57:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:29.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:30 compute-1 ceph-mon[80754]: pgmap v1351: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:30.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:31.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:31 compute-1 sudo[232387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:31 compute-1 sudo[232387]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:31 compute-1 sudo[232387]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:31 compute-1 sudo[232412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:57:31 compute-1 sudo[232412]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:31 compute-1 sudo[232412]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:32 compute-1 sudo[232437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:32 compute-1 sudo[232437]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:32 compute-1 sudo[232437]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:32 compute-1 sudo[232462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:57:32 compute-1 sudo[232462]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:32 compute-1 ceph-mon[80754]: pgmap v1352: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:32 compute-1 sudo[232462]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:32 compute-1 nova_compute[225815]: 2025-11-29 06:57:32.638 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:32 compute-1 nova_compute[225815]: 2025-11-29 06:57:32.639 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:32 compute-1 nova_compute[225815]: 2025-11-29 06:57:32.767 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:32 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:32.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:33.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:33 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:57:33 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 06:57:33 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:57:33 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Nov 29 06:57:33 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 06:57:33 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:57:34 compute-1 ceph-mon[80754]: pgmap v1353: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:34 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:34 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:34 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:34.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:35.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:35 compute-1 ceph-mon[80754]: pgmap v1354: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:36 compute-1 sshd-session[232518]: Received disconnect from 93.157.248.178 port 60444:11: Bye Bye [preauth]
Nov 29 06:57:36 compute-1 sshd-session[232518]: Disconnected from authenticating user root 93.157.248.178 port 60444 [preauth]
Nov 29 06:57:36 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:36 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:36 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:36.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:37.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:37 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:38 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/2021761786' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:57:38 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/2021761786' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:57:38 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:38 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:38 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:38.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:39.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:40 compute-1 ceph-mon[80754]: pgmap v1355: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:40 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:40 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:40 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:40.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:41 compute-1 ceph-mon[80754]: pgmap v1356: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:41.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:42 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:42 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:42 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:42 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:42.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:43.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:44 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:44 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:57:44 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:44.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:57:45 compute-1 ceph-mon[80754]: pgmap v1357: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:45.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:46 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:46 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:46 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:46.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:47.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:47 compute-1 ceph-mon[80754]: pgmap v1358: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:47 compute-1 ceph-mon[80754]: pgmap v1359: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:48 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:48 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:48 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:48.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:49 compute-1 podman[232520]: 2025-11-29 06:57:49.345331757 +0000 UTC m=+0.088706728 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:57:49 compute-1 ceph-mon[80754]: pgmap v1360: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:49.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:50 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:50 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:50 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:50.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:51.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:52 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:52 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:52 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:52.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:53 compute-1 ceph-mon[80754]: pgmap v1361: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:53 compute-1 podman[232549]: 2025-11-29 06:57:53.323414713 +0000 UTC m=+0.056874231 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 06:57:53 compute-1 podman[232548]: 2025-11-29 06:57:53.350467711 +0000 UTC m=+0.089248743 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:57:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:53.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:54 compute-1 ceph-mon[80754]: pgmap v1362: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:54 compute-1 ceph-mon[80754]: pgmap v1363: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:54 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:54 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:54 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:54.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:55.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:56 compute-1 ceph-mon[80754]: pgmap v1364: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:56 compute-1 sudo[232585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:57:56 compute-1 sudo[232585]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:56 compute-1 sudo[232585]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:56 compute-1 sudo[232610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 29 06:57:56 compute-1 sudo[232610]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:57:56 compute-1 sudo[232610]: pam_unix(sudo:session): session closed for user root
Nov 29 06:57:56 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:56 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:56 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:56.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:57.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:57:57 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:57:57 compute-1 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 06:57:57 compute-1 ceph-mon[80754]: pgmap v1365: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:57:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:57:58 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:58 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:57:58 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:58.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:57:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:57:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:57:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:59.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:00 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:00 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:00 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:00.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:01 compute-1 ceph-mon[80754]: pgmap v1366: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:01.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:02 compute-1 ceph-mon[80754]: pgmap v1367: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:02 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:02 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:02 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:02.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:03.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:03 compute-1 ceph-mon[80754]: pgmap v1368: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:04 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:04 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:04 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:04.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:05.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:06 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:06 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:06 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:06.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:07 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:07 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:07 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:07.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:07 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:08 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:08 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:08 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:08.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:09 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:09 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:09 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:09.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:10 compute-1 ceph-mon[80754]: pgmap v1369: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:10 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:10 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:10 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:10.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:11 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:11 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:11 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:11.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:12 compute-1 ceph-mon[80754]: pgmap v1370: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:12 compute-1 ceph-mon[80754]: pgmap v1371: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:12 compute-1 ceph-mon[80754]: pgmap v1372: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:12 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:12 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:12 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:12 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:12.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:13 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:13 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:13 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:13.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:14 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:14 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:14 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:14.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:15 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:15 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:15 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:15.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:58:15.930 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:58:15.930 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:15 compute-1 ovn_metadata_agent[139241]: 2025-11-29 06:58:15.931 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:16 compute-1 ceph-mon[80754]: pgmap v1373: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:16 compute-1 ceph-mon[80754]: pgmap v1374: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:16 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:16 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:16 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:16.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:17 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:17 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:17 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:17.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:17 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:18 compute-1 ceph-mon[80754]: pgmap v1375: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:18 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:18 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:18 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:18.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:19 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:19 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:19 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:19.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:19 compute-1 nova_compute[225815]: 2025-11-29 06:58:19.968 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:20 compute-1 ceph-mon[80754]: pgmap v1376: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:20 compute-1 podman[232635]: 2025-11-29 06:58:20.358653699 +0000 UTC m=+0.103573907 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller)
Nov 29 06:58:20 compute-1 nova_compute[225815]: 2025-11-29 06:58:20.967 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:20 compute-1 nova_compute[225815]: 2025-11-29 06:58:20.967 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:58:20 compute-1 nova_compute[225815]: 2025-11-29 06:58:20.967 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:58:20 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:20 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:20 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:20.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:21 compute-1 nova_compute[225815]: 2025-11-29 06:58:21.375 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:58:21 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:21 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:21 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:21.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:22 compute-1 ceph-mon[80754]: pgmap v1377: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:22 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:22 compute-1 nova_compute[225815]: 2025-11-29 06:58:22.966 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:22 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:22 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:22 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:22.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:23 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:23 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:23 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:23.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:23 compute-1 nova_compute[225815]: 2025-11-29 06:58:23.967 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:23 compute-1 nova_compute[225815]: 2025-11-29 06:58:23.967 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:58:24 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/705562816' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:24 compute-1 ceph-mon[80754]: pgmap v1378: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:24 compute-1 podman[232662]: 2025-11-29 06:58:24.334521175 +0000 UTC m=+0.077089144 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, tcib_managed=true)
Nov 29 06:58:24 compute-1 podman[232663]: 2025-11-29 06:58:24.349997972 +0000 UTC m=+0.087028392 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 06:58:24 compute-1 nova_compute[225815]: 2025-11-29 06:58:24.967 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:24 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:24 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:24 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:24.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:25 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3131628032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:25 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2010625727' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:25 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:25 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:25 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:25.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:25 compute-1 nova_compute[225815]: 2025-11-29 06:58:25.648 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:25 compute-1 nova_compute[225815]: 2025-11-29 06:58:25.649 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:25 compute-1 nova_compute[225815]: 2025-11-29 06:58:25.649 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:25 compute-1 nova_compute[225815]: 2025-11-29 06:58:25.649 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:58:25 compute-1 nova_compute[225815]: 2025-11-29 06:58:25.649 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:58:26 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2138341124' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.139 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.339 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.340 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5342MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.341 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.342 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.449 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.450 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.467 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:26 compute-1 ceph-mon[80754]: pgmap v1379: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:26 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/2138341124' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:26 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 06:58:26 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2021454943' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.924 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.930 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.953 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.955 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.956 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:26 compute-1 nova_compute[225815]: 2025-11-29 06:58:26.957 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:26 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:26 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:26 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:26.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:27 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:27 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:27 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:27.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:27 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:27 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/3947950218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:27 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/2021454943' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 06:58:27 compute-1 ceph-mon[80754]: pgmap v1380: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:27 compute-1 nova_compute[225815]: 2025-11-29 06:58:27.975 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:27 compute-1 nova_compute[225815]: 2025-11-29 06:58:27.975 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:28 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:28 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:28 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:28.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:29 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:29 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:29 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:29.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:29 compute-1 nova_compute[225815]: 2025-11-29 06:58:29.966 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:30 compute-1 nova_compute[225815]: 2025-11-29 06:58:30.962 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:30 compute-1 ceph-mon[80754]: pgmap v1381: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:30 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:30 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:30 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:30.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:31 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:31 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:31 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:31.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:32 compute-1 ceph-mon[80754]: pgmap v1382: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:32 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:32 compute-1 nova_compute[225815]: 2025-11-29 06:58:32.967 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:32 compute-1 nova_compute[225815]: 2025-11-29 06:58:32.968 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 06:58:32 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:32 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:32 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:32.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:33 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:33 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:33 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:33.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:33 compute-1 ceph-mon[80754]: pgmap v1383: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:34 compute-1 sshd-session[232745]: Accepted publickey for zuul from 192.168.122.10 port 56398 ssh2: ECDSA SHA256:q0RMlXdalxA6snNWza7TmIndlwLWLLpO+sXhiGKqO/I
Nov 29 06:58:34 compute-1 systemd-logind[785]: New session 51 of user zuul.
Nov 29 06:58:34 compute-1 systemd[1]: Started Session 51 of User zuul.
Nov 29 06:58:34 compute-1 sshd-session[232745]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:58:34 compute-1 sudo[232749]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 29 06:58:34 compute-1 sudo[232749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:58:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:35.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:35 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:35 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:35 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:35.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:36 compute-1 nova_compute[225815]: 2025-11-29 06:58:36.172 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:36 compute-1 nova_compute[225815]: 2025-11-29 06:58:36.174 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 06:58:36 compute-1 nova_compute[225815]: 2025-11-29 06:58:36.197 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 06:58:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:58:36 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.1 total, 600.0 interval
                                           Cumulative writes: 7210 writes, 27K keys, 7210 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 7210 writes, 1584 syncs, 4.55 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 427 writes, 658 keys, 427 commit groups, 1.0 writes per commit group, ingest: 0.21 MB, 0.00 MB/s
                                           Interval WAL: 427 writes, 198 syncs, 2.16 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 06:58:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:37.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:37 compute-1 ceph-mon[80754]: pgmap v1384: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:37 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:37 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:37 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:37.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:37 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:38 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Nov 29 06:58:38 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1573460643' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 29 06:58:38 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1396494908' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 06:58:38 compute-1 ceph-mon[80754]: from='client.? 192.168.122.10:0/1396494908' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 06:58:38 compute-1 ceph-mon[80754]: from='client.24755 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:38 compute-1 ceph-mon[80754]: pgmap v1385: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:39.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:39 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:39 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:39 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:39.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:39 compute-1 ceph-mon[80754]: from='client.24761 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:39 compute-1 ceph-mon[80754]: from='client.14961 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:39 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1573460643' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 29 06:58:39 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1977521267' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 29 06:58:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:41.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:41 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:41 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000052s ======
Nov 29 06:58:41 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:41.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Nov 29 06:58:42 compute-1 ceph-mon[80754]: from='client.14967 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:42 compute-1 ceph-mon[80754]: from='client.24820 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:42 compute-1 ceph-mon[80754]: pgmap v1386: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:42 compute-1 ceph-mon[80754]: from='client.24826 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:42 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/857796520' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 29 06:58:42 compute-1 sshd-session[233047]: Invalid user sammy from 93.157.248.178 port 60128
Nov 29 06:58:42 compute-1 sshd-session[233047]: Received disconnect from 93.157.248.178 port 60128:11: Bye Bye [preauth]
Nov 29 06:58:42 compute-1 sshd-session[233047]: Disconnected from invalid user sammy 93.157.248.178 port 60128 [preauth]
Nov 29 06:58:42 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:43.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:43 compute-1 ovs-vsctl[233078]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 29 06:58:43 compute-1 ceph-mon[80754]: pgmap v1387: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:43 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:43 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:43 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:43.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:44 compute-1 virtqemud[225339]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 29 06:58:44 compute-1 virtqemud[225339]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 29 06:58:44 compute-1 virtqemud[225339]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 29 06:58:44 compute-1 sshd-session[233125]: Invalid user tester from 66.94.122.234 port 51468
Nov 29 06:58:44 compute-1 sshd-session[233125]: Received disconnect from 66.94.122.234 port 51468:11: Bye Bye [preauth]
Nov 29 06:58:44 compute-1 sshd-session[233125]: Disconnected from invalid user tester 66.94.122.234 port 51468 [preauth]
Nov 29 06:58:44 compute-1 ceph-mon[80754]: pgmap v1388: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:44 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: cache status {prefix=cache status} (starting...)
Nov 29 06:58:44 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:45.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:45 compute-1 lvm[233381]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 06:58:45 compute-1 lvm[233381]: VG ceph_vg0 finished
Nov 29 06:58:45 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: client ls {prefix=client ls} (starting...)
Nov 29 06:58:45 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:45 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:45 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:45 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:45.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:45 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: damage ls {prefix=damage ls} (starting...)
Nov 29 06:58:45 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:45 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump loads {prefix=dump loads} (starting...)
Nov 29 06:58:45 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:45 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Nov 29 06:58:45 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1276981983' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:46 compute-1 ceph-mon[80754]: pgmap v1389: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:46 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Nov 29 06:58:46 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/135313544' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: get subtrees {prefix=get subtrees} (starting...)
Nov 29 06:58:46 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:47.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:47 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: ops {prefix=ops} (starting...)
Nov 29 06:58:47 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Nov 29 06:58:47 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1029441042' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 29 06:58:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Nov 29 06:58:47 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3853464868' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 29 06:58:47 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:47 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:47 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:47.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Nov 29 06:58:47 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1898843982' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Nov 29 06:58:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Nov 29 06:58:47 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/961495441' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:58:47 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: session ls {prefix=session ls} (starting...)
Nov 29 06:58:47 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 06:58:47 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:47 compute-1 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: status {prefix=status} (starting...)
Nov 29 06:58:48 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Nov 29 06:58:48 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2721476524' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:58:48 compute-1 ceph-mon[80754]: from='client.24776 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:48 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1276981983' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:48 compute-1 ceph-mon[80754]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:48 compute-1 ceph-mon[80754]: from='client.24835 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:48 compute-1 ceph-mon[80754]: from='client.24841 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:48 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/135313544' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:58:48 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2741286316' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:48 compute-1 ceph-mon[80754]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:49.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Nov 29 06:58:49 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/684679155' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Nov 29 06:58:49 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3910457812' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Nov 29 06:58:49 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1704439386' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:58:49 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:49 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:49 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:49.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.24853 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.24803 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.14982 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: pgmap v1390: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1029441042' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/1529673451' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3853464868' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.24880 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/4200116878' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1898843982' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/961495441' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/3706317250' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/352369704' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.24836 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/2721476524' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/446556486' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.14994 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.24910 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.24848 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.24916 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: pgmap v1391: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2498518273' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/684679155' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3910457812' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2393270283' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.15018 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1704439386' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Nov 29 06:58:49 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/532443902' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 29 06:58:49 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Nov 29 06:58:49 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4111572753' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:58:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Nov 29 06:58:50 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2877614875' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 29 06:58:50 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Nov 29 06:58:50 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2517383288' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 29 06:58:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:51.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:51 compute-1 podman[234139]: 2025-11-29 06:58:51.369827739 +0000 UTC m=+0.100615568 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 134 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133) [1] r=-1 lpr=133 pi=[78,133)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1e] failed. State was: not registered w/ OSD
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69623808 unmapped: 1540096 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.10 scrub starts
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:02.376413+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  log_queue is 3 last_log 221 sent 220 num 3 unsent 1 sending 1
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  will send 2025-11-29T06:25:32.220457+0000 osd.0 (osd.0) 221 : cluster [DBG] 9.10 scrub starts
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 134 heartbeat osd_stat(store_statfs(0x1bca1f000/0x0/0x1bfc00000, data 0x13d19c/0x1fe000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 909376 data_alloc: 285212672 data_used: 393216
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69632000 unmapped: 1531904 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.10 scrub ok
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 134 pg[9.1e( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133) [1] r=-1 lpr=133 DELETING pi=[78,133)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 1.245882 2 0.000271
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 134 pg[9.1e( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133) [1] r=-1 lpr=133 pi=[78,133)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.246134 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 134 pg[9.1e( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133) [1] r=-1 lpr=133 pi=[78,133)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 6.692822 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1e] failed. State was: not registered w/ OSD
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:03.376653+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  log_queue is 4 last_log 222 sent 221 num 4 unsent 1 sending 1
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  will send 2025-11-29T06:25:33.258119+0000 osd.0 (osd.0) 222 : cluster [DBG] 9.10 scrub ok
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 134 heartbeat osd_stat(store_statfs(0x1bca20000/0x0/0x1bfc00000, data 0x13d19c/0x1fe000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69632000 unmapped: 1531904 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:04.376924+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69632000 unmapped: 1531904 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:05.377162+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69640192 unmapped: 1523712 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:06.377439+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69640192 unmapped: 1523712 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 134 heartbeat osd_stat(store_statfs(0x1bca20000/0x0/0x1bfc00000, data 0x13d19c/0x1fe000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.11 scrub starts
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.325085640s of 12.187417030s, submitted: 25
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.11 scrub ok
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:07.377646+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  log_queue is 6 last_log 224 sent 222 num 6 unsent 2 sending 2
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  will send 2025-11-29T06:25:37.316830+0000 osd.0 (osd.0) 223 : cluster [DBG] 9.11 scrub starts
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  will send 2025-11-29T06:25:37.348575+0000 osd.0 (osd.0) 224 : cluster [DBG] 9.11 scrub ok
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 899808 data_alloc: 285212672 data_used: 393216
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69648384 unmapped: 1515520 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:08.377892+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69648384 unmapped: 1515520 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:09.378091+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69648384 unmapped: 1515520 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:10.378361+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=98) [0] r=0 lpr=98 crt=56'1130 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 131.103333 122 0.000815
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=98) [0] r=0 lpr=98 crt=56'1130 mlcod 0'0 active mbc={}] exit Started/Primary/Active 131.109797 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=98) [0] r=0 lpr=98 crt=56'1130 mlcod 0'0 active mbc={}] exit Started/Primary 134.366278 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=98) [0] r=0 lpr=98 crt=56'1130 mlcod 0'0 active mbc={}] exit Started 134.366639 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=98) [0] r=0 lpr=98 crt=56'1130 mlcod 0'0 active mbc={}] enter Reset
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.898561478s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 active pruub 437.762023926s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] exit Reset 0.001853 1 0.002531
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] enter Started
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] enter Start
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] exit Start 0.000102 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] enter Started/Stray
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 135 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69664768 unmapped: 1499136 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client handle_log_ack log(last 220) v1
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  logged 2025-11-29T06:25:30.265301+0000 osd.0 (osd.0) 219 : cluster [DBG] 9.1f deep-scrub starts
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  logged 2025-11-29T06:25:30.296946+0000 osd.0 (osd.0) 220 : cluster [DBG] 9.1f deep-scrub ok
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client handle_log_ack log(last 221) v1
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  logged 2025-11-29T06:25:32.220457+0000 osd.0 (osd.0) 221 : cluster [DBG] 9.10 scrub starts
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client handle_log_ack log(last 222) v1
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  logged 2025-11-29T06:25:33.258119+0000 osd.0 (osd.0) 222 : cluster [DBG] 9.10 scrub ok
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:11.378583+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69664768 unmapped: 1499136 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:12.378748+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 903726 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69664768 unmapped: 1499136 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 135 heartbeat osd_stat(store_statfs(0x1bca1c000/0x0/0x1bfc00000, data 0x13edf5/0x201000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:13.378937+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69664768 unmapped: 1499136 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:14.379082+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69672960 unmapped: 1490944 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:15.379238+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 135 heartbeat osd_stat(store_statfs(0x1bca1c000/0x0/0x1bfc00000, data 0x13edf5/0x201000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69672960 unmapped: 1490944 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:16.379399+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client handle_log_ack log(last 224) v1
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  logged 2025-11-29T06:25:37.316830+0000 osd.0 (osd.0) 223 : cluster [DBG] 9.11 scrub starts
Nov 29 06:58:51 compute-1 ceph-osd[78089]: log_client  logged 2025-11-29T06:25:37.348575+0000 osd.0 (osd.0) 224 : cluster [DBG] 9.11 scrub ok
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 5.847332 3 0.000221
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 5.847530 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] enter Reset
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] exit Reset 0.000124 1 0.000194
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] enter Started
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] enter Start
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] exit Start 0.000008 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] enter Started/Primary
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] enter Started/Primary/Peering
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetInfo
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000042 1 0.000051
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetLog
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetLog 0.000029 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetMissing
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69681152 unmapped: 1482752 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 136 heartbeat osd_stat(store_statfs(0x1bca1c000/0x0/0x1bfc00000, data 0x13edf5/0x201000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:17.379573+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 906700 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69681152 unmapped: 1482752 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:18.379792+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69681152 unmapped: 1482752 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:19.379964+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69689344 unmapped: 1474560 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:20.380156+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69689344 unmapped: 1474560 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:21.380414+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.183051109s of 14.367411613s, submitted: 13
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/WaitUpThru 4.780247 4 0.000078
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering 4.780398 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] enter Started/Primary/Active
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 activating+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/Activating
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69689344 unmapped: 1474560 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:22.380580+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 909674 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 137 heartbeat osd_stat(store_statfs(0x1bca16000/0x0/0x1bfc00000, data 0x1425b0/0x207000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69697536 unmapped: 1466368 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/Activating 1.520142 5 0.001220
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000220 1 0.000260
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000973 1 0.000066
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/Recovering
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:23.380794+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69697536 unmapped: 1466368 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:24.380966+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovering 1.535533 2 0.000170
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] enter Started/Primary/Active/Recovered
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69697536 unmapped: 1466368 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:25.381142+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69705728 unmapped: 1458176 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:26.381427+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 137 heartbeat osd_stat(store_statfs(0x1bca16000/0x0/0x1bfc00000, data 0x1425b0/0x207000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69713920 unmapped: 1449984 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:27.381662+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 909834 data_alloc: 285212672 data_used: 405504
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69713920 unmapped: 1449984 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:28.381983+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69713920 unmapped: 1449984 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:29.382247+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69722112 unmapped: 1441792 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:30.382546+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _renew_subs
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69722112 unmapped: 1441792 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovered 6.626645 2 0.000147
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] exit Started/Primary/Active 9.684700 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] exit Started/Primary 14.465409 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] exit Started 14.465553 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] enter Reset
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.835290909s) [1] async=[1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 56'1130 active pruub 461.014038086s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] exit Reset 0.000776 1 0.001470
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] enter Started
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] enter Start
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] exit Start 0.000092 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] enter Started/Stray
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:31.382693+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 138 heartbeat osd_stat(store_statfs(0x1bca16000/0x0/0x1bfc00000, data 0x1425b0/0x207000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69738496 unmapped: 1425408 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:32.383005+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.235433578s of 11.264188766s, submitted: 8
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 912808 data_alloc: 285212672 data_used: 405504
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69746688 unmapped: 1417216 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:33.383210+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69746688 unmapped: 1417216 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:34.383365+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 138 heartbeat osd_stat(store_statfs(0x1bca13000/0x0/0x1bfc00000, data 0x1440d4/0x20a000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 138 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69746688 unmapped: 1417216 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca13000/0x0/0x1bfc00000, data 0x1440d4/0x20a000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:35.383632+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69754880 unmapped: 1409024 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:36.383907+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 5.380763 7 0.000523
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReseved
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReseved 0.000109 1 0.000112
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: not registered w/ OSD
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69754880 unmapped: 1409024 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:37.384168+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 916454 data_alloc: 285212672 data_used: 405504
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69763072 unmapped: 1400832 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:38.384416+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69763072 unmapped: 1400832 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:39.384624+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 DELETING pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 2.720789 2 0.000394
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 2.720963 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 8.101946 0 0.000000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: not registered w/ OSD
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69763072 unmapped: 1400832 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:40.384847+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69763072 unmapped: 1400832 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:41.385109+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69779456 unmapped: 1384448 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:42.385410+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69779456 unmapped: 1384448 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:43.385617+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69787648 unmapped: 1376256 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:44.385805+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69795840 unmapped: 1368064 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:45.385985+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69795840 unmapped: 1368064 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:46.386202+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69804032 unmapped: 1359872 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:47.386389+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69804032 unmapped: 1359872 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:48.386554+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69804032 unmapped: 1359872 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:49.386803+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69812224 unmapped: 1351680 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:50.387047+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69812224 unmapped: 1351680 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:51.387394+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69812224 unmapped: 1351680 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:52.387685+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69820416 unmapped: 1343488 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:53.387847+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69820416 unmapped: 1343488 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:54.388139+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69820416 unmapped: 1343488 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:55.388535+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69828608 unmapped: 1335296 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:56.388735+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69828608 unmapped: 1335296 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:57.388946+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69836800 unmapped: 1327104 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:58.389090+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69836800 unmapped: 1327104 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:25:59.389259+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69836800 unmapped: 1327104 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:00.389526+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69844992 unmapped: 1318912 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:01.389734+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69844992 unmapped: 1318912 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:02.389954+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69853184 unmapped: 1310720 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:03.390133+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69861376 unmapped: 1302528 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:04.390325+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69861376 unmapped: 1302528 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:05.390481+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69869568 unmapped: 1294336 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:06.390715+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69869568 unmapped: 1294336 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:07.390973+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69869568 unmapped: 1294336 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:08.391195+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69877760 unmapped: 1286144 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:09.391378+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69877760 unmapped: 1286144 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:10.391569+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69885952 unmapped: 1277952 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:11.391828+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69894144 unmapped: 1269760 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:12.392092+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:13.392450+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69902336 unmapped: 1261568 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:14.392630+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69902336 unmapped: 1261568 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:15.392871+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69902336 unmapped: 1261568 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:16.393074+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69910528 unmapped: 1253376 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:17.393361+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69910528 unmapped: 1253376 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:18.393540+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69910528 unmapped: 1253376 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:19.393817+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69910528 unmapped: 1253376 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:20.394056+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69918720 unmapped: 1245184 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:21.394199+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69918720 unmapped: 1245184 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:22.394410+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69918720 unmapped: 1245184 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:23.394567+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69926912 unmapped: 1236992 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:24.394791+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69926912 unmapped: 1236992 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:25.394939+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69935104 unmapped: 1228800 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:26.395107+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69935104 unmapped: 1228800 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:27.395341+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69943296 unmapped: 1220608 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:28.395531+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69951488 unmapped: 1212416 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:29.395729+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69951488 unmapped: 1212416 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:30.396028+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69951488 unmapped: 1212416 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:31.396482+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69959680 unmapped: 1204224 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:32.396730+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69959680 unmapped: 1204224 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:33.396894+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69967872 unmapped: 1196032 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:34.397042+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69967872 unmapped: 1196032 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:35.397228+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69967872 unmapped: 1196032 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:36.397456+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69976064 unmapped: 1187840 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:37.397629+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69976064 unmapped: 1187840 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:38.397792+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69984256 unmapped: 1179648 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:39.397937+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69984256 unmapped: 1179648 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:40.398139+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69984256 unmapped: 1179648 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:41.398355+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69992448 unmapped: 1171456 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:42.398560+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69992448 unmapped: 1171456 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:43.398784+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69992448 unmapped: 1171456 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:44.399099+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70000640 unmapped: 1163264 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:45.399408+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70000640 unmapped: 1163264 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:46.399721+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70008832 unmapped: 1155072 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:47.399921+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70008832 unmapped: 1155072 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:48.400130+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70008832 unmapped: 1155072 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:49.400388+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70017024 unmapped: 1146880 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:50.400916+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70017024 unmapped: 1146880 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:51.401107+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70017024 unmapped: 1146880 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:52.401400+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70025216 unmapped: 1138688 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:53.401538+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70025216 unmapped: 1138688 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:54.401695+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70033408 unmapped: 1130496 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:55.401964+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70033408 unmapped: 1130496 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:56.402377+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70033408 unmapped: 1130496 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:57.402729+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70041600 unmapped: 1122304 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:58.402907+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70041600 unmapped: 1122304 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:26:59.403055+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70041600 unmapped: 1122304 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:00.403253+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70049792 unmapped: 1114112 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:01.403411+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70049792 unmapped: 1114112 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:02.403659+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70057984 unmapped: 1105920 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:03.404008+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70057984 unmapped: 1105920 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:04.404495+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70066176 unmapped: 1097728 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:05.404954+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70066176 unmapped: 1097728 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:06.405259+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70066176 unmapped: 1097728 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:07.405373+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70074368 unmapped: 1089536 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:08.405573+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70074368 unmapped: 1089536 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:09.405792+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70074368 unmapped: 1089536 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:10.406154+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70082560 unmapped: 1081344 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:11.406438+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70082560 unmapped: 1081344 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:12.406644+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70098944 unmapped: 1064960 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:13.406980+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70098944 unmapped: 1064960 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:14.407352+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70107136 unmapped: 1056768 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:15.407626+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70115328 unmapped: 1048576 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:16.407794+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70115328 unmapped: 1048576 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:17.408037+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70115328 unmapped: 1048576 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:18.408353+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70123520 unmapped: 1040384 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:19.408538+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70131712 unmapped: 1032192 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:20.408712+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70139904 unmapped: 1024000 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:21.408916+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70139904 unmapped: 1024000 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:22.409171+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70148096 unmapped: 1015808 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:23.409391+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70156288 unmapped: 1007616 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:24.409607+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70156288 unmapped: 1007616 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:25.409814+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70156288 unmapped: 1007616 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:26.409983+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70164480 unmapped: 999424 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:27.410249+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70164480 unmapped: 999424 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:28.410506+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70164480 unmapped: 999424 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:29.410790+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70164480 unmapped: 999424 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:30.411118+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70172672 unmapped: 991232 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:31.411574+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70172672 unmapped: 991232 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:32.411832+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70180864 unmapped: 983040 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:33.411997+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70180864 unmapped: 983040 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:34.412164+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70180864 unmapped: 983040 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:35.412400+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70189056 unmapped: 974848 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:36.412672+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70189056 unmapped: 974848 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:37.413217+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70197248 unmapped: 966656 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:38.413618+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70197248 unmapped: 966656 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:39.413829+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70197248 unmapped: 966656 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:40.414212+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70205440 unmapped: 958464 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:41.414517+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70205440 unmapped: 958464 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:42.414701+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70205440 unmapped: 958464 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:43.414936+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70213632 unmapped: 950272 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:44.415146+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70213632 unmapped: 950272 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:45.415384+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70213632 unmapped: 950272 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:46.415532+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70221824 unmapped: 942080 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:47.415695+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70221824 unmapped: 942080 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:48.415849+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70230016 unmapped: 933888 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:49.415998+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70230016 unmapped: 933888 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:50.416185+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70230016 unmapped: 933888 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:51.416471+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70238208 unmapped: 925696 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:52.416673+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70246400 unmapped: 917504 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:53.416885+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70246400 unmapped: 917504 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:54.417104+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70246400 unmapped: 917504 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:55.417386+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70246400 unmapped: 917504 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:56.417524+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70262784 unmapped: 901120 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:57.417743+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70262784 unmapped: 901120 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:58.417901+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70270976 unmapped: 892928 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:27:59.418042+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70270976 unmapped: 892928 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:00.418277+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70270976 unmapped: 892928 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:01.418416+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70279168 unmapped: 884736 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:02.418603+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70279168 unmapped: 884736 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:03.418792+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70279168 unmapped: 884736 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:04.418936+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70287360 unmapped: 876544 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:05.419057+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70287360 unmapped: 876544 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 5840 writes, 25K keys, 5840 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 5840 writes, 940 syncs, 6.21 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 5840 writes, 25K keys, 5840 commit groups, 1.0 writes per commit group, ingest: 19.19 MB, 0.03 MB/s
                                           Interval WAL: 5840 writes, 940 syncs, 6.21 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:06.419263+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70361088 unmapped: 802816 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:07.419409+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70361088 unmapped: 802816 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:08.419518+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70361088 unmapped: 802816 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:09.419699+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70369280 unmapped: 794624 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:10.419905+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70369280 unmapped: 794624 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:11.434118+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70369280 unmapped: 794624 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:12.434335+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70377472 unmapped: 786432 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:13.434506+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70377472 unmapped: 786432 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:14.434757+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70377472 unmapped: 786432 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:15.434934+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70385664 unmapped: 778240 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:16.435119+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70385664 unmapped: 778240 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:17.435377+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70393856 unmapped: 770048 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:18.435590+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70393856 unmapped: 770048 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:19.435786+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70393856 unmapped: 770048 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:20.436012+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70402048 unmapped: 761856 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:21.436146+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70402048 unmapped: 761856 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:22.436388+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70410240 unmapped: 753664 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:23.436579+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70410240 unmapped: 753664 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:24.436748+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70410240 unmapped: 753664 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:25.436924+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70418432 unmapped: 745472 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:26.437215+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70418432 unmapped: 745472 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:27.437396+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70426624 unmapped: 737280 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:28.437592+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70426624 unmapped: 737280 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:29.437768+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70426624 unmapped: 737280 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:30.437951+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70434816 unmapped: 729088 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:31.438096+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70434816 unmapped: 729088 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:32.438341+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70434816 unmapped: 729088 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:33.438560+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70443008 unmapped: 720896 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:34.438744+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70443008 unmapped: 720896 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:35.438927+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70443008 unmapped: 720896 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:36.439078+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70451200 unmapped: 712704 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:37.439244+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70451200 unmapped: 712704 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:38.439428+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70459392 unmapped: 704512 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:39.439651+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70459392 unmapped: 704512 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:40.439925+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70459392 unmapped: 704512 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:41.440369+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70475776 unmapped: 688128 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:42.440574+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70475776 unmapped: 688128 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:43.440708+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70475776 unmapped: 688128 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:44.440910+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70483968 unmapped: 679936 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:45.441113+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70483968 unmapped: 679936 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:46.441251+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70483968 unmapped: 679936 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:47.441507+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70492160 unmapped: 671744 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:48.441734+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70492160 unmapped: 671744 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:49.441936+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:50.442209+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70500352 unmapped: 663552 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:51.442407+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70500352 unmapped: 663552 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:52.442549+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70500352 unmapped: 663552 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:53.442725+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70508544 unmapped: 655360 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:54.442891+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70508544 unmapped: 655360 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:55.443354+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70516736 unmapped: 647168 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:56.444158+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70516736 unmapped: 647168 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:57.444401+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70516736 unmapped: 647168 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:58.444612+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70524928 unmapped: 638976 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:28:59.444901+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70524928 unmapped: 638976 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:00.445169+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70524928 unmapped: 638976 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:01.445439+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70533120 unmapped: 630784 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:02.445818+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70533120 unmapped: 630784 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:03.446064+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70541312 unmapped: 622592 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:04.446283+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70541312 unmapped: 622592 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:05.446821+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70549504 unmapped: 614400 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:06.447008+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70549504 unmapped: 614400 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:07.447373+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70549504 unmapped: 614400 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:08.447645+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70557696 unmapped: 606208 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:09.447881+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70557696 unmapped: 606208 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:10.448104+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70565888 unmapped: 598016 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:11.448312+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70565888 unmapped: 598016 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:12.448572+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70565888 unmapped: 598016 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:13.448748+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70574080 unmapped: 589824 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:14.448973+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70574080 unmapped: 589824 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:15.449212+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70574080 unmapped: 589824 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: handle_config config(27 keys) v1
Nov 29 06:58:51 compute-1 ceph-osd[78089]: set_mon_vals no callback set
Nov 29 06:58:51 compute-1 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_accepted_admin_roles = ResellerAdmin, swiftoperator: Configuration option 'rgw_keystone_accepted_admin_roles' may not be modified at runtime
Nov 29 06:58:51 compute-1 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_accepted_roles = member, Member, admin: Configuration option 'rgw_keystone_accepted_roles' may not be modified at runtime
Nov 29 06:58:51 compute-1 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_admin_domain = default: Configuration option 'rgw_keystone_admin_domain' may not be modified at runtime
Nov 29 06:58:51 compute-1 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_admin_password = 12345678: Configuration option 'rgw_keystone_admin_password' may not be modified at runtime
Nov 29 06:58:51 compute-1 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_admin_project = service: Configuration option 'rgw_keystone_admin_project' may not be modified at runtime
Nov 29 06:58:51 compute-1 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_admin_user = swift: Configuration option 'rgw_keystone_admin_user' may not be modified at runtime
Nov 29 06:58:51 compute-1 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_implicit_tenants = true: Configuration option 'rgw_keystone_implicit_tenants' may not be modified at runtime
Nov 29 06:58:51 compute-1 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_url = https://keystone-internal.openstack.svc:5000: Configuration option 'rgw_keystone_url' may not be modified at runtime
Nov 29 06:58:51 compute-1 ceph-osd[78089]: operator() osd_memory_target cleared (was 5502923980)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _update_cache_settings updated pcm target: 4294967296 pcm min: 134217728 pcm max: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:16.449512+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 589824 heap: 71163904 old mem: 3872179014 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:17.450014+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 581632 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:18.450438+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 581632 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:19.450636+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 565248 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:20.450969+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 565248 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:21.451210+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70606848 unmapped: 557056 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:22.451399+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70606848 unmapped: 557056 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:23.451582+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70606848 unmapped: 557056 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:24.451805+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 548864 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:25.451972+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 548864 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:26.452166+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 548864 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:27.452383+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 532480 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:28.452550+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 532480 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:29.452721+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 532480 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:30.452917+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70639616 unmapped: 524288 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:31.453130+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70639616 unmapped: 524288 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:32.453273+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 516096 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:33.453438+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 516096 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:34.453646+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 507904 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:35.453817+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 499712 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:36.453971+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 499712 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:37.454127+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 491520 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:38.454262+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 491520 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:39.454453+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 491520 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:40.454621+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 483328 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:41.454755+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 483328 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:42.454901+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 483328 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:43.455040+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 475136 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:44.455211+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 475136 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:45.456243+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 466944 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:46.456909+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 466944 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:47.457174+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 466944 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:48.457345+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 458752 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:49.457479+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 458752 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:50.457938+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 458752 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:51.458455+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 450560 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:52.459277+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 450560 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:53.459523+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 442368 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:54.459728+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 442368 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:55.459872+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 442368 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:56.460440+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 434176 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:57.460675+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 434176 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:58.460964+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 434176 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:29:59.461234+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 425984 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:00.461466+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 425984 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:01.461644+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 425984 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:02.461909+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 417792 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:03.462068+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 417792 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:04.462307+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 417792 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:05.462505+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 409600 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:06.462689+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 409600 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:07.462958+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 401408 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:08.463159+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 401408 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:09.463392+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 401408 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:10.463591+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 401408 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:11.463768+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:12.463968+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:13.464151+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:14.464358+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:15.464561+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:16.464747+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:17.464924+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:18.465144+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:19.465334+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:20.465564+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:21.465787+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:22.465967+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:23.466136+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:24.466263+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:25.466344+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:26.466467+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:27.466605+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:28.466779+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:29.466931+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:30.467094+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 295.191497803s of 297.923675537s, submitted: 5
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 376832 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:31.467256+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [0,0,0,0,1,1,2])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 1384448 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:32.467432+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70836224 unmapped: 1376256 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:33.467597+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70860800 unmapped: 1351680 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904927 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:34.467856+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 1318912 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:35.468000+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 1302528 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:36.468166+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 72048640 unmapped: 163840 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [0,0,0,1,0,0,0,0,2])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:37.468391+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 65536 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:38.468565+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [0,0,0,0,0,0,0,2])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 16384 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904855 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:39.468783+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 999424 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:40.468979+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 966656 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:41.469147+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:42.469318+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:43.469472+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:44.469646+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:45.470004+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:46.470270+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:47.470523+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:48.470845+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:49.471658+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:50.471942+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:51.472451+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:52.472617+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:53.473002+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:54.475124+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:55.475273+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:56.475731+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:57.475894+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:58.476158+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:30:59.476342+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:00.476617+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:01.476915+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:02.477045+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:03.477733+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 925696 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:04.477874+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 925696 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:05.478159+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 925696 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:06.478355+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:07.478644+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:08.479096+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:09.479363+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:10.479548+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:11.479672+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:12.479795+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:13.479919+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:14.480041+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:15.480185+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:16.480341+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:17.480486+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:18.480595+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:19.480738+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:20.480878+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:21.481075+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:22.484480+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:23.485187+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:24.487359+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:25.487854+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:26.488233+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:27.488651+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:28.488995+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:29.489839+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:30.490049+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:31.490422+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:32.490999+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:33.491354+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:34.491821+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:35.491984+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:36.492196+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:37.492535+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:38.492733+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:39.492976+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:40.493239+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:41.493411+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:42.493567+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:43.493689+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:44.493979+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:45.494106+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:46.494333+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:47.494463+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 892928 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:48.494705+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 892928 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:49.494903+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 892928 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:50.495175+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 892928 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:51.495310+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 892928 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:52.495486+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:53.495650+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:54.495810+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:55.495974+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:56.496109+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:57.496236+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:58.496327+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:31:59.496483+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:00.496751+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:01.496920+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:02.497037+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:03.497212+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:04.497362+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:05.497531+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:06.497718+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 876544 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:07.497894+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:08.498029+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:09.498145+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:10.498331+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:11.498428+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:12.498584+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:13.498712+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:14.498825+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:15.498967+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:16.508822+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:17.508968+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:18.509140+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:19.509337+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:20.509507+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:21.509658+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:22.509838+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:23.509999+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:24.510151+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:25.510399+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:26.510570+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:27.511459+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:28.511798+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:29.511967+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:30.512610+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:31.512791+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:32.512965+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:33.513405+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:34.513619+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:35.513757+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:36.513961+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:37.514422+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:38.514870+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:39.515006+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:40.515190+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:41.515363+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:42.515548+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:43.515712+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:44.515891+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:45.516062+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:46.516347+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:47.516506+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:48.516740+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:49.517008+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:50.517220+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:51.517411+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:52.517660+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:53.517825+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:54.517983+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:55.518151+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:56.518311+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:57.518512+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:58.518711+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:32:59.518896+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:00.519092+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:01.519277+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:02.520227+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:03.521143+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:04.521937+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:05.522527+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:06.522961+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 802816 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:07.523434+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 802816 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:08.523870+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 802816 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:09.524252+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 802816 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:10.524703+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 794624 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:11.525074+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 786432 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:12.525357+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 786432 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:13.525545+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:14.525821+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:15.526114+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:16.526270+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:17.526510+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:18.526745+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:19.526918+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:20.527207+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:21.527422+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 745472 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:22.527644+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566c8bc9000 session 0x5566c98185a0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: handle_auth_request added challenge on 0x5566c929a000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 745472 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:23.527843+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 745472 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:24.528088+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 745472 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:25.528330+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 745472 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:26.528543+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:27.528779+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:28.528978+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:29.529176+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:30.529656+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:31.529911+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:32.531453+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:33.531670+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:34.532026+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:35.532226+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:36.532490+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:37.532710+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:38.532930+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:39.533149+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:40.533372+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:41.533522+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:42.534760+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:43.534906+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 712704 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:44.535037+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 712704 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:45.535175+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 712704 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:46.535363+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 696320 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:47.535499+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 696320 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:48.535644+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 696320 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:49.535826+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 696320 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:50.535993+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:51.536128+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:52.536363+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:53.536494+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:54.536629+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:55.536761+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:56.536895+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:57.537055+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:58.537181+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:33:59.537342+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:00.537484+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:01.537647+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:02.537782+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:03.537943+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:04.538092+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:05.538227+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:06.538360+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:07.538533+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:08.538695+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:09.538820+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:10.538980+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:11.539109+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:12.539245+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:13.539410+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:14.539579+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:15.539816+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:16.623436+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:17.623596+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:18.623847+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:19.624050+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:20.624419+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:21.624621+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:22.624941+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 655360 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:23.625196+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 655360 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:24.625426+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 655360 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:25.625705+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 655360 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:26.625913+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:27.626104+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:28.626390+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:29.626639+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:30.626983+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:31.627167+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:32.627363+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:33.627583+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:34.627792+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:35.627988+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:36.628907+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:37.629130+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:38.630135+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:39.630982+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:40.631490+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:41.632157+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:42.632364+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:43.632835+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:44.633157+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:45.633320+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:46.633436+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:47.633722+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:48.633872+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:49.634024+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:50.634340+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:51.634693+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:52.634969+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:53.635099+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:54.635237+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:55.635413+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:56.635538+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:57.635655+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:58.635968+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:34:59.636126+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:00.636466+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:01.636744+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:02.636997+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:03.637283+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:04.637548+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:05.637797+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:06.638061+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:07.638430+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:08.638657+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:09.638840+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:10.639109+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:11.639332+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:12.639522+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:13.639674+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:14.639878+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:15.640128+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:16.640376+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:17.640556+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:18.640742+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 581632 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:19.640916+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:20.641169+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:21.641315+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:22.641521+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:23.641729+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:24.641881+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:25.642064+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:26.642231+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:27.642471+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73711616 unmapped: 598016 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:28.642691+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73711616 unmapped: 598016 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:29.642907+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73711616 unmapped: 598016 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:30.643142+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566ca6b0800 session 0x5566c89854a0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: handle_auth_request added challenge on 0x5566c8bc9000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:31.643377+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:32.643566+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:33.646592+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:34.646804+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:35.646956+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:36.647087+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:37.647239+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:38.647417+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:39.647556+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:40.647717+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:41.647979+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:42.648211+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:43.648379+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:44.648553+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:45.648696+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:46.648832+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:47.649051+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 557056 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:48.649239+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 557056 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:49.649407+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 557056 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:50.649587+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 557056 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:51.649791+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 557056 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:52.650007+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:53.650172+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:54.650371+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:55.650537+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:56.650716+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:57.650981+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:58.651215+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:35:59.651404+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:00.651585+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:01.651786+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:02.651984+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:03.652211+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:04.652442+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:05.652631+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:06.652792+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 532480 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:07.652989+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 532480 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:08.653222+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 532480 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:09.653457+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 524288 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:10.653656+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 524288 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:11.653821+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 524288 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:12.654023+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 524288 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:13.654148+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 524288 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:14.654309+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 516096 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:15.654552+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 516096 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:16.654726+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 507904 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:17.654864+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 507904 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:18.655021+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 507904 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:19.655243+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 507904 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:20.655476+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 507904 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:21.655615+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 499712 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:22.655776+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 499712 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:23.655989+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 499712 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:24.656136+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 499712 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:25.656342+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 499712 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:26.656608+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:27.656799+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:28.656944+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:29.657137+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:30.657337+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:31.657663+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:32.657818+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:33.657977+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:34.658123+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:35.658258+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:36.658467+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:37.658598+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:38.658752+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:39.658887+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:40.659083+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:41.659228+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:42.659360+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:43.659580+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:44.659719+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:45.659861+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:46.660023+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:47.660267+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:48.660502+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:49.660755+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:50.661175+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:51.661339+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:52.661467+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:53.661603+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:54.661747+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:55.661858+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:56.661980+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:57.662171+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:58.662339+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73850880 unmapped: 458752 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:36:59.662477+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:00.662717+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:01.662916+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:02.663083+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:03.663251+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:04.663362+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:05.663497+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:06.663634+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:07.663811+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:08.663946+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:09.664133+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:10.664359+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:11.664485+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:12.664617+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:13.664851+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73883648 unmapped: 425984 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:14.665083+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73883648 unmapped: 425984 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:15.665274+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:16.665507+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:17.665727+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:18.665963+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:19.666188+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:20.666434+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:21.666754+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:22.666972+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:23.667204+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:24.667441+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:25.667697+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:26.667995+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:27.668266+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:28.668499+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:29.668694+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:30.668975+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:31.669174+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:32.669580+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:33.669735+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:34.670046+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:35.670173+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:36.670323+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:37.670492+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:38.670685+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:39.671125+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:40.671382+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:41.671536+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:42.671752+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:43.672030+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:44.672166+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:45.672380+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:46.672533+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:47.672757+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:48.672901+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:49.673065+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:50.673391+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:51.673564+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:52.673776+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:53.673998+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:54.674150+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:55.674329+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:56.674516+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:57.674697+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:58.674870+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:37:59.675071+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:00.675243+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:01.675362+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:02.675502+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:03.675623+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:04.675796+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:05.675993+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 6284 writes, 25K keys, 6284 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 6284 writes, 1144 syncs, 5.49 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 444 writes, 711 keys, 444 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
                                           Interval WAL: 444 writes, 204 syncs, 2.18 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdd350#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:06.676196+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:07.676374+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:08.676528+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:09.676719+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:10.676948+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:11.677102+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:12.677320+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:13.677463+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:14.677610+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:15.677759+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:16.677881+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:17.678124+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:18.678385+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:19.678579+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:20.678789+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:21.679749+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:22.680609+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:23.681886+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:24.682052+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:25.682230+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:26.682520+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:27.683127+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:28.683342+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:29.683478+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:30.683999+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:31.684437+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:32.684555+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:33.684861+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:34.685196+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:35.685566+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:36.685754+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:37.685914+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:38.686164+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:39.686361+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:40.686914+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:41.687102+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:42.687285+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:43.687526+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:44.687660+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:45.687807+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:46.687985+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:47.688146+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:48.688344+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:49.688568+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:50.688785+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:51.688948+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:52.689126+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:53.689329+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:54.689584+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:55.689783+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:56.689939+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:57.690139+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:58.690333+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:38:59.690498+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:00.690866+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:01.691025+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:02.691398+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:03.691547+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:04.691760+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:05.691895+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:06.692055+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:07.692201+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:08.692403+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:09.692575+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:10.692737+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:11.692922+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:12.693096+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:13.693266+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:14.693477+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:15.693621+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:16.693767+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:17.693906+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:18.694087+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:19.694262+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:20.694519+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:21.694721+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:22.694878+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:23.695030+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:24.695172+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:25.695361+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:26.695570+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:27.695693+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:28.695807+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:29.695969+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:30.696215+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:31.696398+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:32.696537+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:33.696683+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:34.696835+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:35.696985+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:36.697164+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:37.697358+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:38.697502+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:39.697651+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:40.697835+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:41.697972+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:42.698099+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:43.698229+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:44.698379+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:45.698560+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:46.698716+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:47.698855+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:48.699045+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:49.699250+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:50.699515+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:51.699655+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:52.700261+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:53.700424+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:54.700568+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:55.700747+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:56.700918+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:57.701274+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:58.701572+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:39:59.701846+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:00.702116+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:01.702280+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:02.702443+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:03.702611+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:04.702751+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:05.702935+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:06.703132+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:07.703323+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:08.703456+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:09.703590+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:10.703772+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:11.703918+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:12.704097+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:13.704318+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:14.704556+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:15.704792+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:16.704994+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:17.705239+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:18.705412+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:19.705640+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:20.705926+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:21.706115+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:22.706270+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:23.706679+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:24.706815+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:25.707011+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:26.707220+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:27.707403+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:28.707698+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:29.707908+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:30.708144+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 594.295410156s of 600.553833008s, submitted: 240
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:31.708316+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904942 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 221184 heap: 75358208 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:32.708449+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75374592 unmapped: 1032192 heap: 76406784 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:33.708817+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:34.709146+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:35.709385+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:36.710656+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:37.710895+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:38.711078+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:39.711201+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 1957888 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:40.711385+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 1957888 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:41.711716+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 1957888 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:42.711897+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 1957888 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:43.712091+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75505664 unmapped: 1949696 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:44.712267+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75522048 unmapped: 1933312 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:45.712429+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75522048 unmapped: 1933312 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:46.712600+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75522048 unmapped: 1933312 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:47.712781+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:48.712984+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:49.713156+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:50.713481+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:51.713659+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:52.713886+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:53.714092+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:54.714256+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:55.714463+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:56.714589+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:57.714734+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:58.714915+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:40:59.715150+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:00.715450+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:01.715660+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:02.715879+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:03.716076+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 1908736 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:04.716276+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 1908736 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:05.716448+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 1908736 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:06.716611+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 1908736 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:07.716807+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 1908736 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:08.718244+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75554816 unmapped: 1900544 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:09.718676+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75554816 unmapped: 1900544 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:10.718868+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75554816 unmapped: 1900544 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:11.719100+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75554816 unmapped: 1900544 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:12.720065+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:13.720343+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:14.721087+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:15.721336+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:16.721458+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:17.721626+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:18.721789+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:19.721922+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:20.722130+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:21.722368+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:22.722567+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:23.722822+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:24.723032+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:25.723238+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:26.723378+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:27.723715+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:28.723960+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:29.724192+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:30.724501+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:31.724855+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:32.725095+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:33.725401+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75579392 unmapped: 1875968 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:34.725566+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75579392 unmapped: 1875968 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:35.725746+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75579392 unmapped: 1875968 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:36.725944+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:37.726079+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:38.726253+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:39.726419+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:40.726669+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:41.726920+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:42.727091+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:43.727275+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:44.727498+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:45.727693+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:46.727893+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:47.728107+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:48.728281+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:49.728446+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:50.728691+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:51.728858+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:52.729023+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:53.729212+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:54.729388+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:55.729543+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:56.729765+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:57.729922+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:58.730072+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:41:59.730237+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:00.730420+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:01.730577+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:02.730710+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:03.730851+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:04.731001+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:05.731134+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:06.731349+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:07.731520+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:08.731676+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:09.731789+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:10.731956+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:11.732125+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:12.732281+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:13.732482+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:14.732613+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:15.732747+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:16.732887+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:17.733032+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:18.733167+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:19.733400+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:20.733568+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:21.733760+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:22.733904+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:23.734030+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:24.734224+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:25.734417+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:26.734685+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:27.734881+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:28.735062+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:29.735281+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:30.735567+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:31.735804+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:32.736040+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:33.736249+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:34.736431+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:35.736652+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:36.736916+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:37.737177+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:38.737499+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:39.737822+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:40.738136+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:41.738406+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:42.738586+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:43.738811+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:44.739037+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:45.739495+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:46.739899+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:47.740094+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:48.740390+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:49.740650+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:50.741429+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:51.741663+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:52.741997+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:53.742353+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:54.742554+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:55.742723+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:56.743014+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75644928 unmapped: 1810432 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:57.743226+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75644928 unmapped: 1810432 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:58.743442+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75653120 unmapped: 1802240 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:42:59.743652+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75653120 unmapped: 1802240 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:00.743936+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75653120 unmapped: 1802240 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:01.744359+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75653120 unmapped: 1802240 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:02.744556+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:03.744896+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:04.745163+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:05.745357+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:06.745602+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:07.745866+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:08.746081+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:09.746401+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:10.746686+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:11.746883+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:12.747049+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:13.747265+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:14.747507+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:15.747693+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:16.747864+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:17.748040+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:18.748181+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:19.748356+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:20.748600+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:21.748788+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:22.748928+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:23.749139+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:24.749359+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:25.749563+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:26.749712+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:27.749938+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:28.750179+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:29.750426+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:30.750673+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:31.750860+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:32.751060+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:33.751222+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:34.751424+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:35.751648+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:36.751789+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:37.751992+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:38.752140+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:39.752378+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:40.752568+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:41.752736+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:42.752935+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:43.753211+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75702272 unmapped: 1753088 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:44.753410+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75702272 unmapped: 1753088 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:45.753602+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75702272 unmapped: 1753088 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:46.753767+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75702272 unmapped: 1753088 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:47.753926+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75702272 unmapped: 1753088 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:48.754070+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:49.754237+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:50.754573+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:51.754757+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:52.755024+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:53.755258+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:54.755497+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:55.755674+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:56.755857+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:57.756026+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:58.756351+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:43:59.756585+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:00.756864+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:01.757015+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:02.757179+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:03.757351+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:04.757566+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:05.757807+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:06.757933+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:07.758077+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:08.758351+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:09.758517+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:10.758759+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:11.759233+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:12.759430+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:13.759637+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:14.759796+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:15.759980+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:16.760142+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:17.760333+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:18.760505+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:19.760758+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:20.761004+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:21.761166+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:22.761402+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:23.761627+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:24.761836+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:25.762098+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:26.762251+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:27.762459+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:28.762681+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:29.762885+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:30.763112+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:31.763276+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:32.763486+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:33.763728+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:34.763934+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:35.764356+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:36.764531+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:37.764763+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:38.764971+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:39.765228+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:40.765510+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:41.765671+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:42.765822+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:43.766060+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:44.766261+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:45.766501+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:46.766715+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:47.766912+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:48.767089+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:49.767451+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:50.767702+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:51.767931+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:52.768137+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:53.768342+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:54.768508+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75767808 unmapped: 1687552 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:55.768684+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75767808 unmapped: 1687552 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:56.768865+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75767808 unmapped: 1687552 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:57.769240+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:58.769462+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:44:59.769712+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:00.769923+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:01.770069+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:02.770329+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:03.770569+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:04.770741+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:05.771025+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:06.771228+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:07.771447+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:08.771690+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:09.771955+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:10.772202+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:11.772404+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:12.772617+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:13.772862+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:14.773066+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:15.773284+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:16.773503+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:17.773692+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:18.773896+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:19.774078+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:20.774367+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:21.774588+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:22.774777+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:23.775013+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:24.775280+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:25.775528+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:26.775759+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:27.775908+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:28.776105+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:29.776325+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:30.776548+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:31.776750+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:32.777017+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 1638400 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:33.777219+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 1638400 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:34.777520+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 1638400 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:35.777747+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 1638400 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:36.777938+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:37.778143+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:38.778386+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:39.778556+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:40.778713+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:41.778928+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:42.779103+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:43.779391+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:44.779610+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:45.779842+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:46.780109+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:47.780528+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:48.780816+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:49.781016+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:50.781234+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:51.781384+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:52.781549+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:53.781759+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:54.781992+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:55.782204+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:56.782550+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:57.782808+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:58.783022+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:45:59.783183+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:00.783403+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:01.783677+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:02.783850+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:03.784026+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:04.784256+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:05.784541+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:06.784704+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:07.784873+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:08.785067+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:09.785334+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:10.785597+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:11.785757+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:12.785970+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:13.786194+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:14.786472+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:15.786682+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:16.786927+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:17.787114+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:18.787380+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:19.787653+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:20.787955+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:21.788154+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:22.788370+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:23.788501+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:24.788863+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:25.789094+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:26.790012+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:27.790193+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:28.790780+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:29.790959+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:30.791325+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:31.791491+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:32.792550+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:33.792778+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:34.793056+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:35.793394+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:36.793718+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:37.794075+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:38.794346+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:39.794527+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:40.794781+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:41.794968+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:42.795150+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:43.795413+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:44.795576+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:45.795803+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:46.796090+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:47.796360+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:48.796566+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:49.796842+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:50.797179+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:51.797391+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:52.797519+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:53.797727+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:54.797983+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:55.798246+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:56.798420+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:57.798579+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:58.798771+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:46:59.798954+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:00.799195+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:01.799371+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:02.799513+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:03.799643+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:04.799798+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:05.799975+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:06.800127+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:07.800279+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:08.800433+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:09.800574+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:10.800787+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:11.800935+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:12.801099+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:13.801343+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75915264 unmapped: 1540096 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:14.801579+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75915264 unmapped: 1540096 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:15.801730+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75915264 unmapped: 1540096 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:16.801925+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:17.802083+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:18.802237+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:19.802386+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:20.802588+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:21.802748+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:22.802918+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:23.803072+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:24.803268+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:25.803503+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:26.803694+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:27.803819+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:28.803986+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:29.804833+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:30.805489+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:31.805748+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:32.806216+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:33.806755+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:34.807400+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:35.807619+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:36.807906+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:37.808081+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:38.808349+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:39.808573+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:40.808909+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:41.809219+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:42.809484+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:43.809710+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:44.809962+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:45.810226+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:46.810482+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:47.810722+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:48.810918+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:49.811164+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:50.811446+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:51.811604+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:52.811783+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:53.811944+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:54.812134+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:55.812359+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:56.812547+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:57.812692+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:58.812833+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:47:59.812989+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:00.813181+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:01.813345+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:02.813507+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:03.813653+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:04.813863+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:05.814035+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 6783 writes, 26K keys, 6783 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 6783 writes, 1386 syncs, 4.89 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 499 writes, 770 keys, 499 commit groups, 1.0 writes per commit group, ingest: 0.25 MB, 0.00 MB/s
                                           Interval WAL: 499 writes, 242 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:06.814247+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:07.814431+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:08.815762+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:09.816039+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:10.816341+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:11.816597+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:12.816719+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:13.816952+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:14.817102+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:15.817315+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: mgrc ms_handle_reset ms_handle_reset con 0x5566c6fd9c00
Nov 29 06:58:51 compute-1 ceph-osd[78089]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1221624088
Nov 29 06:58:51 compute-1 ceph-osd[78089]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1221624088,v1:192.168.122.100:6801/1221624088]
Nov 29 06:58:51 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:51 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:51 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:51.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: get_auth_request con 0x5566ca6b1800 auth_method 0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: mgrc handle_mgr_configure stats_period=5
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:16.817483+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76136448 unmapped: 1318912 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:17.817635+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76136448 unmapped: 1318912 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:18.817783+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76136448 unmapped: 1318912 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:19.817967+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76136448 unmapped: 1318912 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:20.818190+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76136448 unmapped: 1318912 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566c8bb9000 session 0x5566c7db7c20
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: handle_auth_request added challenge on 0x5566ca6b0800
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:21.818371+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76152832 unmapped: 1302528 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:22.818553+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566c929a000 session 0x5566caee6780
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: handle_auth_request added challenge on 0x5566c929b800
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566c929a400 session 0x5566c7db7680
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: handle_auth_request added challenge on 0x5566c929a000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 1294336 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:23.818737+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 1294336 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:24.818929+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 1294336 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:25.819126+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:26.819316+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:27.819531+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:28.819666+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:29.819901+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:30.820130+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:31.820455+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:32.820652+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:33.820861+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:34.821000+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:35.821167+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:36.821331+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:37.821515+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:38.821686+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:39.821890+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:40.822103+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:41.822323+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:42.822459+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:43.822639+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:44.822804+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:45.822975+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:46.823117+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:47.823253+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:48.823418+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:49.823530+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:50.823729+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:51.823856+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:52.824003+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:53.824183+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:54.824413+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:55.824577+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:56.824748+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:57.824976+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:58.825196+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:48:59.825398+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:00.825651+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:01.825852+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:02.826042+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:03.826193+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:04.826352+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:05.826549+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:06.826783+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:07.827029+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:08.827218+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:09.827396+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:10.827670+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:11.827854+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:12.828075+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:13.828240+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:14.828390+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:15.828564+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:16.828738+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:17.828894+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:18.829153+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:19.829444+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:20.829671+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:21.829910+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:22.830113+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:23.830340+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:24.830550+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:25.830798+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:26.831055+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:27.831247+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:28.831445+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:29.831591+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:30.831817+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:31.832084+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:32.832246+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:33.832470+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:34.832696+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:35.832894+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:36.833040+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:37.833722+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:38.833900+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:39.834049+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:40.834256+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:41.834509+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:42.834657+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:43.834795+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:44.834988+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:45.835141+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:46.835348+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:47.835504+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:48.835661+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:49.835808+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:50.835971+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:51.836113+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:52.836327+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:53.836502+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:54.836710+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:55.836810+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:56.836961+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:57.837099+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:58.837229+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 1204224 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:49:59.837378+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 1204224 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:00.837570+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 1204224 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:01.837715+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 1187840 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:02.837865+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 1187840 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:03.838017+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 1187840 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:04.838188+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 1187840 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:05.838394+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:06.838579+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:07.838717+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:08.838910+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:09.839118+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:10.839342+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:11.839537+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:12.839773+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:13.839946+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:14.840174+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:15.840383+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:16.840637+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:17.841055+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:18.841248+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:19.841455+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:20.841699+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:21.841953+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:22.842140+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:23.842388+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:24.842545+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:25.842705+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:26.842933+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:27.843172+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:28.843405+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:29.843623+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566c8bc9000 session 0x5566cb593e00
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: handle_auth_request added challenge on 0x5566cb826000
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:30.843859+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 1155072 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 599.411193848s of 600.292236328s, submitted: 257
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:31.844095+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [0,0,0,0,0,0,0,0,0,4,2])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:32.844502+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:33.844716+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 1114112 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:34.844874+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [0,0,0,1])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76488704 unmapped: 966656 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:35.845013+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 876544 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:36.845210+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77660160 unmapped: 1892352 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:37.845435+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77717504 unmapped: 1835008 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:38.845635+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:39.845864+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:40.846155+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:41.846347+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:42.846578+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:43.847510+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:44.848201+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:45.848581+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:46.848991+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:47.849424+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:48.849577+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:49.849826+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:50.850151+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:51.850370+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:52.850600+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:53.850855+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:54.851076+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:55.851241+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:56.851810+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:57.852226+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:58.852548+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:50:59.852731+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:00.853083+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 1810432 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:01.853247+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 1810432 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:02.853448+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77750272 unmapped: 1802240 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:03.853625+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77750272 unmapped: 1802240 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:04.853777+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77750272 unmapped: 1802240 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:05.853928+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:06.854060+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:07.854227+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:08.854509+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:09.854729+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:10.854937+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:11.855144+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:12.855332+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:13.855497+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:14.855691+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:15.855959+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:16.856239+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:17.856408+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:18.856632+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:19.856917+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:20.857209+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:21.857474+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:22.857774+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:23.858009+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:24.858238+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:25.858447+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:26.858708+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:27.858989+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:28.859206+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:29.859394+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:30.859675+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:31.859883+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:32.860076+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:33.860386+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:34.860562+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:35.860780+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:36.861058+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:37.861219+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:38.861392+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:39.861538+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:40.861746+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:41.861972+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:42.862185+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:43.862365+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:44.862531+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:45.862673+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:46.862823+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:47.863056+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:48.863232+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:49.863410+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:50.863663+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:51.863816+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:52.863949+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:53.864185+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:54.864461+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:55.864706+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:56.864875+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:57.865085+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:58.865306+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:51:59.865484+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:00.865685+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:01.865862+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:02.866034+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:03.866266+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:04.866493+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:05.866775+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:06.866942+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:07.867155+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:08.867415+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:09.867636+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:10.867870+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:11.868046+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:12.868338+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:13.868559+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:14.868770+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:15.868970+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:16.869180+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:17.869480+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:18.869784+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:19.869893+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:20.870155+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:21.870414+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:22.870626+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:23.870837+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:24.871019+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:25.871150+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:26.871369+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:27.871607+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:28.871792+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:29.871969+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:30.872249+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:31.872452+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:32.872674+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:33.872814+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:34.872992+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:35.873149+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:36.873335+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:37.873570+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:38.873903+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:39.874106+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:40.874380+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:41.874613+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:42.874772+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:43.874923+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:44.875132+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:45.875406+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:46.875624+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:47.875887+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:48.876045+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:49.876230+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:50.876451+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:51.876654+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:52.876877+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:53.877060+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:54.877257+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:55.877438+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:56.877677+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:57.877939+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:58.878135+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:52:59.878395+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:00.878663+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:01.878903+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:02.879101+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:03.879405+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:04.879652+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:05.879909+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:06.880112+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:07.880396+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:08.880595+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:09.880843+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:10.881114+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:11.881344+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:12.881563+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:13.882106+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:14.882489+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:15.882708+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:16.882978+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:17.883171+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:18.883390+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:19.883600+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:20.883895+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:21.884094+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:22.884321+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:23.884508+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:24.884768+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:25.884957+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:26.885172+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:27.885416+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:28.885684+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:29.885955+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:30.886265+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:31.886531+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:32.886804+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:33.887031+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:34.887234+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:35.887418+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:36.887697+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:37.887983+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:38.888248+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:39.888555+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:40.888927+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:41.889202+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:42.889435+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:43.889686+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:44.889949+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:45.890178+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:46.890409+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:47.890614+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:48.890773+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:49.891009+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:50.891318+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:51.891520+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:52.891706+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:53.891896+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:54.892094+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:55.892275+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:56.892463+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:57.892673+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:58.892936+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:53:59.893130+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:00.893395+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:01.893593+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:02.893826+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:03.894023+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:04.894218+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:05.894390+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:06.894608+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:07.894809+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:08.895019+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:09.895194+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:10.895404+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:11.895547+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:12.895734+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:13.895949+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:14.896118+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:15.896373+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:16.896588+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:17.896737+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:18.896909+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:19.897080+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:20.897518+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:21.897737+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:22.897925+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:23.898211+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:24.898448+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:25.898662+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:26.898971+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:27.899455+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:28.900118+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:29.900399+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:30.901077+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:31.901635+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:32.902025+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:33.902707+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:34.903231+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:35.903820+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:36.904351+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:37.904781+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:38.905363+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:39.905842+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:40.906411+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:41.906918+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:42.907352+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:43.907831+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:44.908242+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:45.908604+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:46.908951+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:47.909270+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:48.909939+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:49.910109+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:50.910391+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:51.910732+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:52.911052+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:53.911416+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:54.911660+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:55.911897+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:56.912124+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:57.912327+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:58.912516+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:54:59.912682+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:00.912922+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:01.913212+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:02.913465+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:03.913743+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:04.914013+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:05.914273+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:06.914632+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:07.914926+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:08.915149+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:09.915439+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:10.915693+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:11.915950+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:12.916253+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:13.916482+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:14.916741+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:15.916948+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:16.917176+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:17.917358+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:18.917578+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:19.917886+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:20.918179+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:21.918395+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:22.918606+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:23.918833+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:24.919115+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:25.919369+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:26.919574+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:27.919831+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:28.920105+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:29.920281+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:30.920587+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:31.920900+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:32.921203+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:33.921414+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:34.921664+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:35.921916+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:36.922185+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:37.922440+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:38.922652+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:39.922925+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:40.923210+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:41.923520+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:42.923836+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:43.924023+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:44.924434+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:45.924686+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:46.924941+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:47.925156+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:48.925324+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:49.925553+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:50.925822+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:51.925988+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:52.926124+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:53.926247+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:54.926399+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:55.926584+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:56.926759+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:57.926941+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:58.927184+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:55:59.927364+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:00.927628+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:01.927859+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:02.928060+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:03.928209+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:04.928380+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:05.928537+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:06.928817+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:07.928974+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:08.929198+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:09.929452+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:10.929763+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:11.930005+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:12.930177+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:13.930392+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:14.930612+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:15.930799+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:16.931042+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:17.931256+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:18.931507+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:19.931699+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:20.931890+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:21.932146+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:22.932365+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:23.932537+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:24.932730+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:25.932905+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:26.933113+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:27.933332+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:28.933537+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:29.933753+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:30.933968+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:31.934126+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:32.934410+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:33.934587+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:34.934859+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:35.935058+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:36.935372+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:37.935642+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:38.935832+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:39.936027+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:40.936379+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:41.936650+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:42.936866+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:43.937060+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:44.937261+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:45.937437+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:46.937631+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:47.937793+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:48.937969+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:49.938233+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:50.938526+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:51.938752+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:52.938951+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:53.939236+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:54.939438+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:55.939679+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:56.939836+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:57.940017+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:58.940221+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:56:59.940451+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:00.940690+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:01.940837+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:02.940983+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:03.941246+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:04.941446+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:05.941639+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:06.941843+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:07.942740+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:08.943455+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:09.943988+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:10.944435+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:11.944751+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:12.945001+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:13.945211+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:14.945393+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:15.945814+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:16.946349+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:17.946904+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:18.947119+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:19.947413+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:20.947830+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:21.948091+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:22.948398+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:23.948679+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:24.948912+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:25.949112+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:26.949274+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:27.949515+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:28.949824+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:29.950107+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:30.950362+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:31.950580+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:32.950726+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:33.950947+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:34.951102+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:35.951275+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:36.951534+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:37.951790+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:38.952000+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:39.952201+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:40.952480+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:41.952671+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:42.952865+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:43.953080+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:44.953262+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:45.953500+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:46.953720+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:47.953931+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:48.954134+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:49.954368+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:50.954670+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:51.954828+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:52.954974+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:53.955127+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:54.955348+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:55.955536+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:56.955700+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:57.955978+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:58.956120+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:57:59.956318+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:00.956581+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:01.956837+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:02.957006+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:03.957232+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:04.957482+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:05.957673+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.1 total, 600.0 interval
                                           Cumulative writes: 7210 writes, 27K keys, 7210 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 7210 writes, 1584 syncs, 4.55 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 427 writes, 658 keys, 427 commit groups, 1.0 writes per commit group, ingest: 0.21 MB, 0.00 MB/s
                                           Interval WAL: 427 writes, 198 syncs, 2.16 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:06.957890+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:07.958088+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:08.958329+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:09.958647+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:10.958932+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:11.959094+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:12.959330+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:13.959454+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:14.959572+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:15.959756+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:16.959882+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:17.960007+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 1540096 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: do_command 'config diff' '{prefix=config diff}'
Nov 29 06:58:51 compute-1 ceph-osd[78089]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:18.960119+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: do_command 'config show' '{prefix=config show}'
Nov 29 06:58:51 compute-1 ceph-osd[78089]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 29 06:58:51 compute-1 ceph-osd[78089]: do_command 'counter dump' '{prefix=counter dump}'
Nov 29 06:58:51 compute-1 ceph-osd[78089]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 29 06:58:51 compute-1 ceph-osd[78089]: do_command 'counter schema' '{prefix=counter schema}'
Nov 29 06:58:51 compute-1 ceph-osd[78089]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 78512128 unmapped: 2088960 heap: 80601088 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:19.960281+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 1982464 heap: 80601088 old mem: 2845415833 new mem: 2845415833
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: tick
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_tickets
Nov 29 06:58:51 compute-1 ceph-osd[78089]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-29T06:58:20.960474+0000)
Nov 29 06:58:51 compute-1 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 06:58:51 compute-1 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 06:58:51 compute-1 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 06:58:51 compute-1 ceph-osd[78089]: do_command 'log dump' '{prefix=log dump}'
Nov 29 06:58:52 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:58:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:53.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:58:53 compute-1 crontab[234325]: (root) LIST (root)
Nov 29 06:58:53 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/258398947' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:58:53 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/532443902' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 29 06:58:53 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/4111572753' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:58:53 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2584480393' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 29 06:58:53 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3758656299' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 06:58:53 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/844408131' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 29 06:58:53 compute-1 ceph-mon[80754]: from='client.24970 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:53 compute-1 ceph-mon[80754]: from='client.24893 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:58:53 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1662226395' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:58:53 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/4145305371' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 29 06:58:53 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:53 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:53 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:53.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:53 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Nov 29 06:58:53 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1233692885' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:58:53 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Nov 29 06:58:53 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3946325939' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 29 06:58:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Nov 29 06:58:54 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3661981742' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:58:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Nov 29 06:58:54 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3635354810' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:58:54 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Nov 29 06:58:54 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3429457872' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:58:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 06:58:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:55.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 06:58:55 compute-1 podman[234545]: 2025-11-29 06:58:55.448766129 +0000 UTC m=+0.075202774 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:58:55 compute-1 podman[234544]: 2025-11-29 06:58:55.459183479 +0000 UTC m=+0.082366326 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 06:58:55 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:55 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:55 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:55.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:55 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Nov 29 06:58:55 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1183724430' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:58:56 compute-1 systemd[1]: Starting Hostname Service...
Nov 29 06:58:56 compute-1 systemd[1]: Started Hostname Service.
Nov 29 06:58:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:57.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:57 compute-1 sudo[234768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:58:57 compute-1 sudo[234768]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:57 compute-1 sudo[234768]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:57 compute-1 sudo[234795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:58:57 compute-1 sudo[234795]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:57 compute-1 sudo[234795]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/3995147939' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/2877614875' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/2517383288' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/1377123572' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3003957364' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1120183002' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: pgmap v1392: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:57 compute-1 ceph-mon[80754]: pgmap v1393: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/1383921042' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2433791124' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2594614874' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1233692885' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3946325939' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/3661981742' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:58:57 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3783031133' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:58:57 compute-1 sudo[234824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:58:57 compute-1 sudo[234824]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:57 compute-1 sudo[234824]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:57 compute-1 sudo[234858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --image quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 --timeout 895 ls
Nov 29 06:58:57 compute-1 sudo[234858]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:58:57 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:57 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:57 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:57.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:57 compute-1 podman[234983]: 2025-11-29 06:58:57.858548207 +0000 UTC m=+0.096372834 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Nov 29 06:58:57 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:58:57 compute-1 podman[234983]: 2025-11-29 06:58:57.957652022 +0000 UTC m=+0.195476629 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 06:58:58 compute-1 sudo[234858]: pam_unix(sudo:session): session closed for user root
Nov 29 06:58:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:59.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:58:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Nov 29 06:58:59 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3562221273' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Nov 29 06:58:59 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Nov 29 06:58:59 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2432053198' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:58:59 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:58:59 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:58:59 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:59.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:59:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Nov 29 06:59:00 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1177386484' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:59:00 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Nov 29 06:59:00 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/255758300' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Nov 29 06:59:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:59:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:59:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:59:01.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.15078 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.15093 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/1819404255' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.24944 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3635354810' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.25021 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1423480369' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/3266793758' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.24950 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3429457872' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.25033 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/634521314' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/4117577449' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.24962 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: pgmap v1394: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.25039 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/363659733' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.24968 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.25057 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.15147 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.15153 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.24983 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.25066 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/4186939961' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.15168 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:01 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3606582697' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 29 06:59:01 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:59:01 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:59:01 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:59:01.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:59:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Nov 29 06:59:02 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/269676052' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Nov 29 06:59:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Nov 29 06:59:02 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1938043130' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Nov 29 06:59:02 compute-1 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 06:59:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:59:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 06:59:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:59:03.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 06:59:03 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Nov 29 06:59:03 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4256520886' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Nov 29 06:59:03 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Nov 29 06:59:03 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/585662946' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Nov 29 06:59:03 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Nov 29 06:59:03 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1903066086' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Nov 29 06:59:03 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:59:03 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:59:03 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:59:03.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:59:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Nov 29 06:59:04 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1465321967' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Nov 29 06:59:04 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1200614993' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/1183724430' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/800220038' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.24989 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.25075 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.25081 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.25087 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.24998 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.15186 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: pgmap v1395: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.25096 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.25010 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.25105 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.25016 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.25117 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.25028 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2561335456' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1523353195' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: pgmap v1396: 305 pgs: 305 active+clean; 456 KiB data, 153 MiB used, 21 GiB / 21 GiB avail
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.15207 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/1840177804' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/3562221273' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.101:0/2432053198' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.15219 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/2442087058' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.102:0/4281976021' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/95145201' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.15231 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/2283979250' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/871154065' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: from='client.? 192.168.122.100:0/3338702776' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Nov 29 06:59:04 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0) v1
Nov 29 06:59:04 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3267749641' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Nov 29 06:59:04 compute-1 sudo[235697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:59:04 compute-1 sudo[235697]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:59:04 compute-1 sudo[235697]: pam_unix(sudo:session): session closed for user root
Nov 29 06:59:04 compute-1 sudo[235727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 29 06:59:04 compute-1 sudo[235727]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:59:04 compute-1 sudo[235727]: pam_unix(sudo:session): session closed for user root
Nov 29 06:59:04 compute-1 sudo[235773]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Nov 29 06:59:04 compute-1 sudo[235773]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:59:04 compute-1 sudo[235773]: pam_unix(sudo:session): session closed for user root
Nov 29 06:59:04 compute-1 sudo[235815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/cephadm.31206ab20142c8051b6384b731ef7ef7af2407447fac35b7291e90720452ed8d --timeout 895 gather-facts
Nov 29 06:59:04 compute-1 sudo[235815]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Nov 29 06:59:05 compute-1 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 06:59:05 compute-1 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 06:59:05 compute-1 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:59:05.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 06:59:05 compute-1 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Nov 29 06:59:05 compute-1 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/46235445' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
